CN116828207A - Image processing method, device, computer equipment and storage medium

Image processing method, device, computer equipment and storage medium

Info

Publication number: CN116828207A
Application number: CN202210288791.7A
Authority: CN (China)
Prior art keywords: vertex, target, texture, coordinates, normal
Legal status: Pending (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Other languages: Chinese (zh)
Inventor: 石福
Current assignee: Tencent Technology Shenzhen Co Ltd (the listed assignee may be inaccurate; Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list)
Original assignee: Tencent Technology Shenzhen Co Ltd
Application filed by Tencent Technology Shenzhen Co Ltd
Priority to CN202210288791.7A
Publication of CN116828207A

Landscapes

  • Image Generation (AREA)

Abstract

The application provides an image processing method and apparatus, a computer device, and a storage medium, belonging to the field of computer technology. The method comprises the following steps: acquiring first vertex coordinates of a plurality of vertices of a water surface area in a target image, where the vertices represent intersection points on the boundary of the water surface area, and the first vertex coordinates represent the positions of the vertices in the target image; determining target texture information of the plurality of vertices based on dynamic information of the water surface area and a normal texture map, where the dynamic information comprises a water flow speed and a water flow time, the normal texture map represents a correspondence between pixel points in the normal texture map and texture information, and the target texture information represents the textures of the vertices; and rendering the water surface area based on the target texture information of the plurality of vertices. The method enables the water surface area to display a dynamic special effect, yielding a better rendering effect.

Description

Image processing method, device, computer equipment and storage medium
Technical Field
The present application relates to the field of computer technologies, and in particular, to an image processing method, an image processing device, a computer device, and a storage medium.
Background
With the development of computer technology, navigation maps have become widely used in people's daily lives. A navigation map contains water surface areas, which are presented by rendering. How to render a more realistic water surface is a focus of research in the industry.
Currently, the rendering of a water surface area in a navigation map is generally achieved by solid-color filling, which results in a poor rendering effect.
Disclosure of Invention
The embodiments of the present application provide an image processing method and apparatus, a computer device, and a storage medium, which enable a water surface area to display a dynamic special effect with a better rendering effect. The technical solution is as follows:
In one aspect, an image processing method is provided, the method comprising:
acquiring first vertex coordinates of a plurality of vertices of a water surface area in a target image, where the vertices represent intersection points on the boundary of the water surface area, and the first vertex coordinates represent the positions of the vertices in the target image;
determining target texture information of the plurality of vertices based on dynamic information of the water surface area and a normal texture map, where the dynamic information comprises a water flow speed and a water flow time, the normal texture map represents a correspondence between pixel points in the normal texture map and texture information, and the target texture information represents the textures of the vertices;
and rendering the water surface area based on the target texture information of the plurality of vertices.
In another aspect, an image processing apparatus is provided, the apparatus comprising:
an acquisition module, configured to acquire first vertex coordinates of a plurality of vertices of a water surface area in a target image, where the vertices represent intersection points on the boundary of the water surface area, and the first vertex coordinates represent the positions of the vertices in the target image;
a determining module, configured to determine target texture information of the plurality of vertices based on dynamic information of the water surface area and a normal texture map, where the dynamic information comprises a water flow speed and a water flow time, the normal texture map represents the correspondence between pixel points in the normal texture map and texture information, and the target texture information represents the textures of the vertices;
and a rendering module, configured to render the water surface area based on the target texture information of the plurality of vertices.
In some embodiments, the determining module includes:
a first determining unit, configured to determine, for any vertex, at least two second vertex coordinates of the vertex based on at least two water flow speeds and the water flow time in the dynamic information and the first vertex coordinates of the vertex, where the second vertex coordinates represent the position of the vertex in the target image after it is offset;
and a second determining unit, configured to determine the target texture information of the vertex based on the at least two second vertex coordinates and the normal texture map.
In some embodiments, the first determining unit includes:
a first determining subunit, configured to determine, based on a mapping relationship between the target image and a target map, a third vertex coordinate corresponding to the first vertex coordinate, where the third vertex coordinate represents the position of the vertex in the target map;
a second determining subunit, configured to determine at least two fourth vertex coordinates of the vertex based on the third vertex coordinate, the at least two water flow speeds, and the water flow time, where the fourth vertex coordinates represent the position of the vertex in the target map after it is offset;
the first determining subunit is further configured to determine, based on the mapping relationship, the at least two second vertex coordinates corresponding to the at least two fourth vertex coordinates.
In some embodiments, the second determining subunit is configured to determine at least two offsets of the vertex based on the at least two water flow speeds and the water flow time, and to offset the third vertex coordinate based on each of the at least two offsets to obtain the at least two fourth vertex coordinates.
In some embodiments, the normal texture map includes normal vectors corresponding to pixel points;
the second determination unit includes:
an obtaining subunit, configured to obtain at least two normal vectors from the normal texture map based on the at least two second vertex coordinates, where the at least two normal vectors are normal vectors of at least two pixel points indicated by the at least two second vertex coordinates;
and a third determining subunit, configured to determine target texture information of the vertex based on the at least two normal vectors.
In some embodiments, the third determining subunit is configured to obtain an initial texture coordinate of the vertex, where the initial texture coordinate corresponds to the first vertex coordinate; adding the at least two normal vectors to obtain a target normal vector; determining target texture coordinates of the vertex based on the initial texture coordinates of the vertex and the target normal vector; and determining the target texture information based on the target texture coordinates.
In some embodiments, the third determining subunit is configured to offset the initial texture coordinate based on a modulus of the target normal vector to obtain the target texture coordinate.
In some embodiments, the rendering module is configured to obtain a plurality of target colors from a color texture map based on target texture information of the plurality of vertices, where the color texture map is used to represent a correspondence between texture information and colors; rendering the water surface area based on the plurality of target colors.
In another aspect, a computer device is provided, the computer device comprising a processor and a memory, the memory being configured to store at least one computer program, the at least one computer program being loaded and executed by the processor to implement the image processing method in the embodiments of the present application.
In another aspect, a computer-readable storage medium is provided, in which at least one computer program is stored, the at least one computer program being loaded and executed by a processor to implement the image processing method in the embodiments of the present application.
In another aspect, a computer program product or computer program is provided, comprising computer program code stored in a computer-readable storage medium. A processor of a computer device reads the computer program code from the computer-readable storage medium and executes it, causing the computer device to perform the image processing method provided in the various aspects or alternative implementations described above.
According to the solution provided by the embodiments of the present application, the target texture information of the plurality of vertices is obtained from the dynamic information of the water surface area in the target image and the normal texture map, so that the target texture information reflects the dynamic change of the plurality of vertices driven by the water flow speed over the water flow time. Rendering the water surface area based on this target texture information allows the water surface area to display a dynamic special effect, yielding a better rendering effect.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings required for describing the embodiments are briefly introduced below. Obviously, the drawings described below show only some embodiments of the present application, and a person skilled in the art may derive other drawings from them without inventive effort.
FIG. 1 is a schematic view of an implementation environment of an image processing method according to an embodiment of the present application;
FIG. 2 is a flowchart of an image processing method according to an embodiment of the present application;
FIG. 3 is a flowchart of another image processing method provided in accordance with an embodiment of the present application;
FIG. 4 is a schematic illustration of a water surface area provided in accordance with an embodiment of the present application;
FIG. 5 is a schematic illustration of a normal texture map provided in accordance with an embodiment of the present application;
FIG. 6 is a flow chart for rendering a water surface area provided in accordance with an embodiment of the present application;
FIG. 7 is a schematic structural diagram of an image processing apparatus according to an embodiment of the present application;
FIG. 8 is a schematic structural diagram of another image processing apparatus provided according to an embodiment of the present application;
FIG. 9 is a block diagram of a terminal according to an embodiment of the present application.
Detailed Description
For the purpose of making the objects, technical solutions and advantages of the present application more apparent, embodiments of the present application will be further described with reference to the accompanying drawings.
The terms "first," "second," and the like in this disclosure are used for distinguishing between similar elements or items having substantially the same function and function, and it should be understood that there is no logical or chronological dependency between the terms "first," "second," and "n," and that there is no limitation on the amount and order of execution.
The term "at least one" in the present application means one or more, and the meaning of "a plurality of" means two or more.
It should be noted that the information (including, but not limited to, device information of a target object, personal information of a target object, and the like), the data (including, but not limited to, data for analysis, stored data, displayed data, and the like), and the signals involved in the present application are all authorized by the target object or fully authorized by all parties, and the collection, use, and processing of the related data must comply with the relevant laws, regulations, and standards of the relevant countries and regions. For example, the normal texture maps referred to in the present application are all obtained with sufficient authorization.
In order to facilitate understanding, terms related to the present application are explained below.
OpenGL ES: a rendering engine that runs on the mobile end. Other rendering engines include the Unity 3D (Unity three-dimensional) engine, the Unreal engine, and the like.
Texture: the pattern on an object's surface, including both the rugged grooves exhibited by the surface and the pattern on the object's smooth surface.
Normal texture map: a map in which the direction of the normal is encoded in the RGB color channels, where the normal is the normal vector at each point of the object's uneven surface.
Rendering pipeline: the process of converting an object from a three-dimensional scene into an image in a two-dimensional scene and finally displaying the image. The rendering pipeline includes a vertex shader and a fragment shader. In the embodiments of the present application, the water surface in the target map is rendered into the target image.
Vertex shader: a shader program running on a GPU (Graphics Processing Unit) that performs processing such as coordinate transformation on the vertices of an object.
Fragment shader: a shader program running on the GPU that determines, from the texture information of an object, the color with which the object is rendered onto the terminal screen; it may also be referred to as a pixel shader.
Sampling: the way OpenGL ES reads the data content at the texture position indicated by texture coordinates. In the embodiments of the present application, sampling is the process of obtaining a target color from the color texture map through texture coordinates.
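For intuition, sampling can be sketched as a minimal GLSL ES fragment shader. This is purely an illustration; the uniform and varying names below are assumptions, not names used by the application.

precision mediump float;
uniform sampler2D u_colorTexture;  // the texture being sampled (assumed name)
varying vec2 v_texCoord;           // texture coordinates passed from the vertex shader

void main() {
    // texture2D reads the data content at the position given by the texture coordinates
    gl_FragColor = texture2D(u_colorTexture, v_texCoord);
}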
The image processing method provided by the embodiments of the present application can be executed by a computer device. In some embodiments, the computer device is a terminal or a server. In the following, taking the computer device as a terminal as an example, the implementation environment of the image processing method provided by the embodiments of the present application is introduced first. FIG. 1 is a schematic diagram of an implementation environment of an image processing method provided according to an embodiment of the present application. Referring to fig. 1, the implementation environment includes a terminal 101 and a server 102. The terminal 101 and the server 102 can be directly or indirectly connected through wired or wireless communication, which is not limited in the present application.
In some embodiments, terminal 101 is, but is not limited to, a smart phone, tablet, notebook, desktop, smart speaker, smart watch, smart voice-interactive device, smart home appliance, vehicle-mounted terminal, etc. The terminal 101 installs and runs an application program supporting image display. The application may be a navigation-type application, a multimedia-type application, or a game-type application, etc., to which embodiments of the present application are not limited. Taking the example that the application is a navigation class application, the navigation class application is provided with a target map. The terminal 101 is capable of displaying a target image corresponding to a partial region in the target map, the target image including a water surface region therein. The terminal 101 can render the displayed water surface area through the navigation-type application program.
In some embodiments, the server 102 is a stand-alone physical server, a server cluster or distributed system composed of multiple physical servers, or a cloud server providing basic cloud computing services such as cloud services, cloud databases, cloud computing, cloud functions, cloud storage, network services, cloud communication, middleware services, domain name services, security services, CDN (Content Delivery Network), big data, and artificial intelligence platforms. The server 102 is used to provide background services for applications that support image display. In some embodiments, the server 102 takes on primary computing work and the terminal 101 takes on secondary computing work; alternatively, the server 102 takes on secondary computing work and the terminal 101 takes on primary computing work; alternatively, a distributed computing architecture is used for collaborative computing between the server 102 and the terminal 101.
Fig. 2 is a flowchart of an image processing method according to an embodiment of the present application, and referring to fig. 2, in an embodiment of the present application, an example of execution by a terminal will be described. The image processing method includes the steps of:
201. The terminal obtains first vertex coordinates of a plurality of vertices of the water surface area in the target image, where the vertices represent intersection points on the boundary of the water surface area, and the first vertex coordinates represent the positions of the vertices in the target image.
In the embodiment of the application, the target image is the image currently displayed by the terminal. The target image includes a water surface region. The water surface area has a plurality of boundaries, and the intersection point of any two adjacent boundaries is the vertex of the water surface area. The water surface region includes a plurality of vertices. For any vertex, the vertex corresponds to a first vertex coordinate. The first vertex coordinate is a two-dimensional coordinate capable of representing the position of the vertex in the target image. The terminal can acquire first vertex coordinates of a plurality of vertices of a water surface area based on the currently displayed water surface area.
202. The terminal determines target texture information of the plurality of vertices based on dynamic information of the water surface area and a normal texture map, where the dynamic information comprises a water flow speed and a water flow time, the normal texture map represents the correspondence between pixel points in the normal texture map and texture information, and the target texture information represents the textures of the vertices.
In the embodiments of the present application, the terminal can offset the plurality of vertices of the water surface area based on the water flow speed and the water flow time in the dynamic information, so as to simulate the dynamic effect represented by the dynamic information. Based on the mapping relationship between vertex coordinates and the normal texture map, the terminal can determine a corresponding plurality of pixel points in the normal texture map from the vertex coordinates of the plurality of vertices after the offset. Then, the terminal determines the target texture information of the plurality of vertices based on the plurality of pixel points. The target texture information represents the texture of a vertex, and the texture represents the degree of undulation of the water surface at the position of the vertex.
203. The terminal renders the water surface area based on the target texture information of the plurality of vertices.
In the embodiments of the present application, the terminal can determine the textures of the plurality of vertices based on their target texture information. The texture can reflect information such as the illumination intensity, shadow, and vertex color at the position of a vertex. The terminal is then able to render the water surface area based on the plurality of textures.
In the embodiments of the present application, the target texture information of the plurality of vertices is obtained from the dynamic information of the water surface area in the target image and the normal texture map, so that the target texture information reflects the dynamic change of the plurality of vertices driven by the water flow speed over the water flow time. Rendering the water surface area based on this target texture information allows the water surface area to display a dynamic special effect, yielding a better rendering effect.
Fig. 3 is a flowchart of another image processing method according to an embodiment of the present application, referring to fig. 3, in the embodiment of the present application, an example of execution by a terminal will be described. The image processing method includes the steps of:
301. The terminal obtains first vertex coordinates of a plurality of vertices of the water surface area in the target image, where the vertices represent intersection points on the boundary of the water surface area, and the first vertex coordinates represent the positions of the vertices in the target image.
In the embodiments of the present application, the terminal displays a target image that includes a water surface area. The water surface area is the area where a water surface in the target map is located, and it is enclosed by a plurality of boundaries. Any two adjacent boundaries intersect, and the intersection point they form is a vertex of the water surface area. The water surface area includes a plurality of vertices. The terminal can obtain the first vertex coordinates of the plurality of vertices based on their positions in the target image, where a first vertex coordinate is a two-dimensional coordinate.
The target image corresponds to a partial area of the target map, and the water surface area it includes may represent a complete water surface in the target map or a partial area of a water surface; this is not limited in the embodiments of the present application. The area corresponding to the target image changes in real time as the position of the terminal changes, and the water surface area in the target image likewise changes as the terminal moves. While the terminal is moving, if the water surface area in the target image represents a complete water surface, the boundaries and vertices of the water surface area in the target image correspond one-to-one to the boundaries and vertices of the water surface in the target map. If the water surface area in the target image represents a partial area of a water surface, that is, only part of the water surface in the target map is displayed on the terminal screen, the part of the water surface outside the screen has been clipped, and new vertices and boundaries are formed.
For example, fig. 4 is a schematic diagram of a water surface area provided according to an embodiment of the present application. Referring to fig. 4, a partial water surface area of a water surface in the target map is displayed on the terminal screen. This partial water surface area is the water surface area in the target image, and it includes vertex a, vertex b, vertex c, vertex d, vertex e, and vertex f. The dashed-line portion shown in fig. 4 represents the water surface outside the screen, which has been clipped. The new vertex formed during clipping is vertex f, and the new boundary is the boundary between vertex a and vertex f.
In the embodiments of the present application, the water surface is taken as an example, and in some embodiments, the water surface may be replaced by a liquid surface of other liquid having fluidity, such as an oil liquid surface, an alcohol liquid surface, and the like, which is not limited in the embodiments of the present application.
302. For any vertex, the terminal determines at least two second vertex coordinates of the vertex based on at least two water flow speeds and the water flow time in the dynamic information and the first vertex coordinates of the vertex, where the second vertex coordinates represent the position of the vertex in the target image after it is offset.
In the embodiments of the present application, the dynamic information includes at least two water flow speeds and the water flow time. The directions of the at least two water flow speeds differ, and the angle between the directions of any two water flow speeds is not larger than a preset angle. A water flow speed can be represented by a two-dimensional vector: the longer the modulus of the vector, the faster the water flow; the shorter the modulus, the slower the water flow. The water flow time is the time interval from when rendering of the water surface area started to the current moment. For any vertex, the terminal can offset the vertex in the direction corresponding to a water flow speed based on that water flow speed and the water flow time, so that the vertex is shifted in the target image from the position represented by the first vertex coordinate to the position represented by a second vertex coordinate. According to the solution provided by the embodiments of the present application, the position of the vertex after being offset in the target image is determined through the water flow speed and the water flow time, so that the offset position reflects the dynamic change of the vertex, which facilitates the subsequent rendering of a water surface area with a dynamic special effect.
It should be noted that the at least two water flow speeds and the water flow time are parameters in the target map, and that the target map and the target image have a mapping relationship; that is, each of the plurality of vertices of the water surface area in the target image has a corresponding position point in the target map. The terminal can determine the at least two second vertex coordinates of the vertex in the target image through the following steps 3021 to 3023:
3021. The terminal determines a third vertex coordinate corresponding to the first vertex coordinate based on the mapping relationship between the target image and the target map, where the third vertex coordinate represents the position of the vertex in the target map.
Wherein, there is mapping relation between the target image and the target map. The position of the vertex in the target image corresponds to the position of the vertex in the target map. The terminal can determine a third vertex coordinate corresponding to the first vertex coordinate in the target map based on the mapping relationship and the first vertex coordinate.
3022. The terminal determines at least two fourth vertex coordinates of the vertex based on the third vertex coordinate, the at least two water flow speeds, and the water flow time, where the fourth vertex coordinates represent the position of the vertex in the target map after it is offset.
The terminal can move the vertex in the target map based on the at least two water flow speeds and the water flow time, so that the position of the vertex represented by the third vertex coordinate is shifted to the positions represented by the fourth vertex coordinates. Accordingly, the terminal determines at least two offsets of the vertex based on the at least two water flow speeds and the water flow time, and then offsets the third vertex coordinate by each of the at least two offsets to obtain the at least two fourth vertex coordinates.
The following takes the terminal offsetting the vertex based on two water flow speeds and the water flow time as an example. The two speeds are a first water flow speed and a second water flow speed, respectively. The terminal can determine the two fourth vertex coordinates of the vertex based on the following Equation one.
Equation one:
NewPoint_1 = Velocity0 * time + InputCoord * Scale1
NewPoint_2 = Velocity1 * time + InputCoord * Scale2
where Velocity0 represents the first water flow speed, Velocity1 represents the second water flow speed, and time represents the water flow time. InputCoord represents the longitude and latitude coordinates of the vertex. Scale1 and Scale2 are scale parameters for converting the longitude and latitude coordinates of the vertex into the target map. Both InputCoord * Scale1 and InputCoord * Scale2 represent the third vertex coordinate. Because the target map would otherwise yield large coordinate values, the coordinate values need to be reduced, so InputCoord is multiplied by the vectors Scale1 and Scale2 to scale it down. NewPoint_1 and NewPoint_2 each represent a fourth vertex coordinate.
It should be noted that the Velocity0 and Velocity1 parameters may be based on the unit of the vertex positions of the water surface area; the embodiments of the present application are not limited in this respect. Because distances in the target map are obtained by shrinking real-world distances according to a preset ratio, the unit of data in the target map is small. If a coordinate with a larger real-world unit were converted directly into a coordinate with this smaller unit, the numerical value of the coordinate would become large, which is not conducive to calculation, so the value of the coordinate needs to be reduced. The terminal can reduce the value of the coordinate through Scale1 and Scale2, and the Scale1 and Scale2 parameters may likewise be based on the unit of the vertex positions of the water surface area, which is not limited by the embodiments of the present application. The terminal can adjust the corresponding parameters according to differences in data units. Scale1 and Scale2 may be the same or different, which is not limited in the embodiments of the present application.
For example, parameters set by the terminal are as follows:
Velocity0: float2(0.016, -0.014); Velocity1: float2(0.025, -0.03);
Scale1: float2(0.00072, 0.00072); Scale2: float2(0.0003, 0.0003).
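As an illustration only, the following GLSL ES vertex-shader sketch implements Equation one with the example parameters above. The patent writes float2, while GLSL ES uses vec2; the attribute, uniform, and varying names here are assumptions, and the projection of the vertex itself is not specified by the source.

attribute vec2 a_inputCoord;   // InputCoord: longitude and latitude of the vertex
uniform float u_time;          // time: the water flow time
varying vec2 v_newPoint1;      // NewPoint_1: first fourth vertex coordinate
varying vec2 v_newPoint2;      // NewPoint_2: second fourth vertex coordinate

void main() {
    const vec2 VELOCITY0 = vec2(0.016, -0.014);  // first water flow speed
    const vec2 VELOCITY1 = vec2(0.025, -0.03);   // second water flow speed
    const vec2 SCALE1 = vec2(0.00072, 0.00072);  // scale parameter 1
    const vec2 SCALE2 = vec2(0.0003, 0.0003);    // scale parameter 2
    // Equation one: offset the scaled third vertex coordinate by speed * time.
    v_newPoint1 = VELOCITY0 * u_time + a_inputCoord * SCALE1;
    v_newPoint2 = VELOCITY1 * u_time + a_inputCoord * SCALE2;
    // A real shader would set gl_Position from the projected vertex position;
    // a placeholder is used here because the projection is not specified.
    gl_Position = vec4(a_inputCoord, 0.0, 1.0);
}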
3023. The terminal determines the at least two second vertex coordinates corresponding to the at least two fourth vertex coordinates based on the mapping relationship.
The terminal determines at least two second vertex coordinates corresponding to the at least two fourth vertex coordinates from the target image based on a mapping relation between the target image and the target map.
According to the solution provided by the embodiments of the present application, the position of the vertex after being offset in the target map is determined through the water flow speed and the water flow time, and the position of the vertex after being offset in the target image is then determined from it. The dynamic change of the vertex in the target image thus accurately reflects its dynamic change in the target map, making the dynamic change of the vertex in the target image more vivid and facilitating the subsequent rendering of a water surface area with a better effect.
After determining the at least two second vertex coordinates of the vertex, the terminal can determine the target texture information of the vertex based on the at least two second vertex coordinates and the normal texture map; that is, the terminal proceeds to steps 303 and 304.
303. The terminal obtains at least two normal vectors from the normal texture map based on at least two second vertex coordinates, wherein the at least two normal vectors are normal vectors of at least two pixel points indicated by the at least two second vertex coordinates, and the normal texture map comprises normal vectors corresponding to the pixel points.
In the embodiments of the present application, the normal texture map represents the correspondence between pixel points in the normal texture map and normal vectors. A normal vector can represent the degree of undulation of the water surface at the position in the water surface area indicated by the pixel point. The normal vector belongs to the texture information of the vertex, so the normal texture map can be used to represent the correspondence between pixel points in the normal texture map and texture information. The pixel points in the normal texture map correspond to pixel points in the target image, and among the pixel points in the target image there are pixel points corresponding to the vertices of the water surface area. The terminal can determine, from the normal texture map, the at least two pixel points indicated by the at least two second vertex coordinates based on those coordinates. Then, the terminal determines the normal vectors corresponding to the at least two second vertex coordinates based on the at least two pixel points. For example, fig. 5 is a schematic diagram of a normal texture map provided according to an embodiment of the present application. Referring to fig. 5, the terminal can obtain the normal vectors corresponding to the at least two second vertex coordinates based on the normal texture map.
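A minimal sketch of this lookup, assuming the offset vertex coordinates arrive as texture coordinates and the normal texture map is bound as a sampler (the names are illustrative, not the application's):

// GLSL ES: fetch the normal vector stored at an offset vertex coordinate.
vec3 fetchNormal(sampler2D normalMap, vec2 offsetCoord) {
    // The RGB channels of the normal texture map encode the normal direction;
    // each sampled component lies in [0, 1].
    return texture2D(normalMap, offsetCoord).rgb;
}

Calling fetchNormal once per second vertex coordinate then yields the at least two normal vectors used in step 304.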
When rendering a plurality of water surface areas, the terminal can render the plurality of water surface areas based on one normal texture map; alternatively, the terminal can render the plurality of water surface areas based on a plurality of normal texture maps, respectively. This is not limited in the embodiments of the present application.
304. The terminal determines target texture information of the vertex based on at least two normal vectors.
In an embodiment of the present application, the target texture information is used to represent the texture of the vertex. The terminal can offset the initial texture coordinates of the vertex based on at least two normal vectors to obtain target texture coordinates. Then, the terminal determines the target texture information of the vertex through the target texture coordinates.
In some embodiments, the terminal can determine the target normal vector by at least two normal vectors, so that the initial texture coordinates of the vertex are offset based on the target normal vector to obtain target texture coordinates, and further determine the target texture information of the vertex. Accordingly, the process of determining the target texture information of the vertex by the terminal based on the at least two normal vectors can be implemented by the following steps 3041 to 3044, including:
3041. The terminal obtains initial texture coordinates of the vertex.
The texture coordinates are used for acquiring colors to render the water surface area. The initial texture coordinate (texture_chord) corresponds to the first vertex coordinate. The terminal can obtain initial texture coordinates for the vertex based on the first vertex coordinates.
3042. The terminal adds the at least two normal vectors to obtain a target normal vector.
After adding the at least two normal vectors, the value range of the resulting target normal vector (AccumulateNormal) changes. The terminal can transform the coordinates of the target normal vector so that the coordinate range of the target normal vector lies between -1 and 1.
For example, take the number of normal vectors as two. The value range of each normal vector is between 0 and 1, so after the two normal vectors are added, the value range of the resulting target normal vector is between 0 and 2. The terminal can apply the transformation AccumulateNormal -= 1.0 to the coordinates of the target normal vector so that the coordinate range of the target normal vector lies between -1 and 1.
3043. The terminal determines the target texture coordinate of the vertex based on the initial texture coordinate of the vertex and the target normal vector.
The terminal offsets the initial texture coordinate based on the modulus of the target normal vector to obtain the target texture coordinate. The terminal can add the modulus of the target normal vector to the initial texture coordinate to obtain the target texture coordinate; alternatively, the terminal can subtract the modulus of the target normal vector from the initial texture coordinate to obtain the target texture coordinate. The two modes apply opposite offsets to the initial texture coordinate.
3044. The terminal determines target texture information based on the target texture coordinates.
The terminal can determine the target texture information of the vertex based on the target texture coordinate of the vertex.
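Putting steps 3041 to 3044 together, the following fragment-shader sketch accumulates two sampled normals, remaps them to the range -1 to 1, and offsets the initial texture coordinate by the modulus of the result. It is an illustration under assumed names, not the application's actual code; the addition on the last line could equally be a subtraction, as step 3043 allows.

// GLSL ES sketch of steps 3041-3044; all names are assumptions.
vec2 computeTargetTexCoord(sampler2D normalMap,
                           vec2 offsetCoord0, vec2 offsetCoord1,
                           vec2 textureChord) {  // textureChord: initial texture coordinate
    // 3042: add the two sampled normal vectors; each component is now in [0, 2].
    vec3 accumulateNormal = texture2D(normalMap, offsetCoord0).rgb
                          + texture2D(normalMap, offsetCoord1).rgb;
    accumulateNormal -= 1.0;   // remap the range to [-1, 1]
    // 3043: offset the initial texture coordinate by the modulus of the
    // target normal vector.
    return textureChord + length(accumulateNormal);
}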
The terminal can repeatedly execute the steps 302 to 304 to obtain target texture information of a plurality of vertexes; alternatively, the terminal may simultaneously execute steps 302 to 304 through multithreading to obtain the target texture information of multiple vertices, which is not limited in the embodiment of the present application.
305. The terminal renders the water surface area based on the target texture information of the plurality of vertices.
In the embodiments of the present application, the terminal can render the water surface area through the target texture information. Accordingly, the terminal obtains a plurality of target colors from the color texture map based on the target texture information of the plurality of vertices, and then renders the water surface area based on the plurality of target colors. The color texture map represents the correspondence between texture information and colors. The terminal can determine a plurality of pixel points in the water surface area based on the plurality of vertices, and can obtain the target texture information of the plurality of pixel points in the same manner as the target texture information of the plurality of vertices. The terminal can then obtain the plurality of target colors based on the plurality of pieces of target texture information corresponding to the water surface area, so as to render the water surface area.
The target texture information comprises the target texture coordinate, and the terminal can obtain the target color corresponding to the target texture coordinate from the color texture map. The target texture coordinate is a two-dimensional vector. Taking the target texture coordinate as (x, y) as an example, the terminal can invert the y component of the target texture coordinate within the range 0 to 1, that is, replace y with 1 - y. Then, the terminal performs texture sampling on the color texture map based on the processed target texture coordinate to obtain the target color corresponding to the target texture coordinate.
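For example, the y-flip and the final color lookup might be sketched as follows (the function and parameter names are illustrative, not the application's):

// GLSL ES: invert the y component within [0, 1], then sample the color texture map.
vec4 sampleTargetColor(sampler2D colorMap, vec2 targetTexCoord) {
    vec2 flipped = vec2(targetTexCoord.x, 1.0 - targetTexCoord.y);  // y -> 1 - y
    return texture2D(colorMap, flipped);   // the target color used for rendering
}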
In order to better understand the present solution, the overall flow of the solution of the present application is explained again. FIG. 6 is a flow chart for rendering a water surface area provided according to an embodiment of the present application. Referring to fig. 6, the terminal loads the water system data of the water surface area in the target map from the hard disk into the memory. The terminal can obtain the third vertex coordinates of the plurality of vertices of the water surface area based on the water system data. Then, the terminal pushes the water system data from the memory to the video memory; that is, the terminal pushes the water system data to the rendering pipeline running on the GPU. The rendering pipeline includes a vertex shader and a fragment shader. The terminal processes the water system data with the vertex shader and can obtain the first vertex coordinates of a vertex based on its third vertex coordinate. Through the vertex shader, the terminal offsets the vertex from the position of the third vertex coordinate in the directions of the at least two water flow speeds, determines the at least two fourth vertex coordinates of the vertex, and then determines the at least two second vertex coordinates of the vertex based on the mapping relationship between the target image and the target map. Then, the terminal performs rasterization on the at least two second vertex coordinates corresponding to the plurality of vertices; that is, the terminal can obtain, by linear interpolation, the positions of the plurality of pixel points of the water surface area after the offset in the target image. The terminal can then obtain the target texture information of the plurality of pixel points through the fragment shader and obtain the corresponding target colors based on the target texture information, so as to render the water surface area. Finally, the terminal displays the rendered water surface area on the screen. The water surface area displays dynamically changing water waves.
According to the solution provided by the embodiments of the present application, the target texture information of the plurality of vertices is obtained from the dynamic information of the water surface area in the target image and the normal texture map, so that the target texture information reflects the dynamic change of the plurality of vertices driven by the water flow speed over the water flow time. Rendering the water surface area based on this target texture information allows the water surface area to display a dynamic special effect, yielding a better rendering effect.
Fig. 7 is a schematic structural view of an image processing apparatus according to an embodiment of the present application. The apparatus is for performing the steps when the above image processing method is performed, see fig. 7, and includes:
an obtaining module 701, configured to obtain first vertex coordinates of a plurality of vertices of a water surface area in a target image, where the vertices represent intersection points on the boundary of the water surface area, and the first vertex coordinates represent the positions of the vertices in the target image;
a determining module 702, configured to determine target texture information of a plurality of vertices based on dynamic information of a water surface area and a normal texture map, where the dynamic information includes a water flow speed and a water flow time, and the normal texture map is used to represent a correspondence between pixel points in the normal texture map and texture information, and the target texture information is used to represent textures of the vertices;
A rendering module 703, configured to render the water surface area based on the target texture information of the plurality of vertices.
In some embodiments, fig. 8 is a schematic structural diagram of another image processing apparatus provided according to an embodiment of the present application, referring to fig. 8, the determining module 702 includes:
a first determining unit 801, configured to determine, for any vertex, at least two second vertex coordinates of the vertex based on at least two water flow speeds and the water flow time in the dynamic information and the first vertex coordinates of the vertex, where the second vertex coordinates represent the position of the vertex in the target image after it is offset;
a second determining unit 802 for determining target texture information of the vertex based on at least two second vertex coordinates and the normal texture map.
In some embodiments, the first determining unit 801 includes:
a first determining subunit 8011, configured to determine, based on a mapping relationship between the target image and the target map, a third vertex coordinate corresponding to the first vertex coordinate, where the third vertex coordinate is used to represent a position of the vertex in the target map;
a second determining subunit 8012, configured to determine at least two fourth vertex coordinates of the vertex based on the third vertex coordinate, the at least two water flow speeds, and the water flow time, where the fourth vertex coordinates represent the position of the vertex in the target map after it is offset;
The first determining subunit 8011 is further configured to determine at least two second vertex coordinates corresponding to the at least two fourth vertex coordinates based on the mapping relationship.
In some embodiments, the second determining subunit 8012 is configured to determine at least two offsets of the vertex based on the at least two water flow speeds and the water flow time, and to offset the third vertex coordinate based on each of the at least two offsets to obtain the at least two fourth vertex coordinates.
In some embodiments, the normal texture map includes normal vectors corresponding to the pixel points;
the second determining unit 802 includes:
an obtaining subunit 8021, configured to obtain at least two normal vectors from the normal texture map based on the at least two second vertex coordinates, where the at least two normal vectors are normal vectors of at least two pixel points indicated by the at least two second vertex coordinates;
a third determining subunit 8022 is configured to determine, based on at least two normal vectors, target texture information of the vertex.
In some embodiments, the third determining subunit 8022 is configured to obtain an initial texture coordinate of the vertex, where the initial texture coordinate corresponds to the first vertex coordinate; adding at least two normal vectors to obtain a target normal vector; determining target texture coordinates of the vertex based on the initial texture coordinates of the vertex and the target normal vector; based on the target texture coordinates, target texture information is determined.
In some embodiments, the third determining subunit 8022 is configured to offset the initial texture coordinate based on a modulus of the target normal vector, to obtain the target texture coordinate.
In some embodiments, the rendering module 703 is configured to obtain a plurality of target colors from a color texture map based on the target texture information of the plurality of vertices, where the color texture map represents the correspondence between texture information and colors, and to render the water surface area based on the plurality of target colors.
According to the image processing apparatus provided by the present application, the target texture information of the plurality of vertices is obtained from the dynamic information of the water surface area in the target image and the normal texture map, so that the target texture information reflects the dynamic change of the plurality of vertices driven by the water flow speed over the water flow time. Rendering the water surface area based on this target texture information allows the water surface area to display a dynamic special effect, yielding a better rendering effect.
It should be noted that, when the image processing apparatus provided in the above embodiments runs an application, the division into the functional modules described above is merely used as an example. In practical applications, the above functions may be assigned to different functional modules as needed; that is, the internal structure of the apparatus may be divided into different functional modules to complete all or part of the functions described above. In addition, the image processing apparatus provided in the above embodiments and the image processing method embodiments belong to the same concept; for the detailed implementation process, refer to the method embodiments, which are not repeated here.
In the embodiments of the present application, the computer device can be configured as a terminal or a server. When the computer device is configured as a terminal, the terminal may act as the execution body to implement the technical solution provided by the embodiments of the present application; when the computer device is configured as a server, the server may act as the execution body; alternatively, the technical solution of the present application may be implemented through interaction between the terminal and the server. This is not limited by the embodiments of the present application.
FIG. 9 is a block diagram of a terminal 900 according to an embodiment of the present application. The terminal 900 may be a portable mobile terminal, such as a smart phone, a tablet computer, an MP3 (Moving Picture Experts Group Audio Layer III) player, an MP4 (Moving Picture Experts Group Audio Layer IV) player, a notebook computer, or a desktop computer. The terminal 900 may also be referred to by other names such as object device, portable terminal, laptop terminal, or desktop terminal.
In general, the terminal 900 includes: a processor 901 and a memory 902.
The processor 901 may include one or more processing cores, such as a 4-core processor or an 8-core processor. The processor 901 may be implemented in at least one hardware form of DSP (Digital Signal Processing), FPGA (Field-Programmable Gate Array), and PLA (Programmable Logic Array). The processor 901 may also include a main processor and a coprocessor. The main processor is a processor for processing data in an awake state, also referred to as a CPU (Central Processing Unit); the coprocessor is a low-power processor for processing data in a standby state. In some embodiments, the processor 901 may be integrated with a GPU (Graphics Processing Unit) responsible for rendering and drawing the content that the display screen needs to display. In some embodiments, the processor 901 may also include an AI (Artificial Intelligence) processor for processing computing operations related to machine learning.
The memory 902 may include one or more computer-readable storage media, which may be non-transitory. The memory 902 may also include high-speed random access memory, as well as non-volatile memory, such as one or more magnetic disk storage devices, flash memory storage devices. In some embodiments, a non-transitory computer readable storage medium in memory 902 is used to store at least one computer program for execution by processor 901 to implement the image processing methods provided by the method embodiments of the present application.
In some embodiments, the terminal 900 may further optionally include: a peripheral interface 903, and at least one peripheral. The processor 901, memory 902, and peripheral interface 903 may be connected by a bus or signal line. The individual peripheral devices may be connected to the peripheral device interface 903 via buses, signal lines, or circuit boards. The peripheral device includes: at least one of radio frequency circuitry 904, a display 905, a camera assembly 906, audio circuitry 907, and a power source 908.
The peripheral interface 903 may be used to connect at least one I/O (Input/Output)-related peripheral device to the processor 901 and the memory 902. In some embodiments, the processor 901, the memory 902, and the peripheral interface 903 are integrated on the same chip or circuit board; in some other embodiments, any one or two of the processor 901, the memory 902, and the peripheral interface 903 may be implemented on a separate chip or circuit board, which is not limited in this embodiment.
The radio frequency circuit 904 is configured to receive and transmit RF (Radio Frequency) signals, also known as electromagnetic signals. The radio frequency circuit 904 communicates with a communication network and other communication devices via electromagnetic signals, converting an electrical signal into an electromagnetic signal for transmission, or converting a received electromagnetic signal into an electrical signal. In some embodiments, the radio frequency circuit 904 includes an antenna system, an RF transceiver, one or more amplifiers, a tuner, an oscillator, a digital signal processor, a codec chipset, a subscriber identity module card, and so forth. The radio frequency circuit 904 may communicate with other terminals via at least one wireless communication protocol, including, but not limited to: the World Wide Web, metropolitan area networks, intranets, mobile communication networks of each generation (2G, 3G, 4G, and 5G), wireless local area networks, and/or WiFi (Wireless Fidelity) networks. In some embodiments, the radio frequency circuit 904 may also include NFC (Near Field Communication)-related circuits, which is not limited by the present application.
The display 905 is used to display a UI (User Interface). The UI may include graphics, text, icons, video, and any combination thereof. When the display 905 is a touch display, the display 905 also has the ability to capture touch signals at or above its surface. The touch signal may be input to the processor 901 as a control signal for processing. At this time, the display 905 may also be used to provide virtual buttons and/or a virtual keyboard, also referred to as soft buttons and/or a soft keyboard. In some embodiments, there may be one display 905, disposed on the front panel of the terminal 900; in other embodiments, there may be at least two displays 905, respectively disposed on different surfaces of the terminal 900 or in a folded design; in still other embodiments, the display 905 may be a flexible display disposed on a curved or folded surface of the terminal 900. The display 905 may even be arranged in an irregular, non-rectangular pattern, that is, a special-shaped screen. The display 905 may be made of materials such as LCD (Liquid Crystal Display) or OLED (Organic Light-Emitting Diode).
The camera assembly 906 is used to capture images or video. In some embodiments, the camera assembly 906 includes a front camera and a rear camera. Typically, the front camera is disposed on the front panel of the terminal and the rear camera is disposed on the back of the terminal. In some embodiments, there are at least two rear cameras, each being any one of a main camera, a depth-of-field camera, a wide-angle camera, and a telephoto camera, so as to implement a background blurring function through fusion of the main camera and the depth-of-field camera, panoramic shooting and VR (Virtual Reality) shooting functions through fusion of the main camera and the wide-angle camera, or other fusion shooting functions. In some embodiments, the camera assembly 906 may also include a flash, which may be a single-color-temperature flash or a dual-color-temperature flash. A dual-color-temperature flash is a combination of a warm-light flash and a cold-light flash and can be used for light compensation at different color temperatures.
The audio circuit 907 may include a microphone and a speaker. The microphone is used to collect sound waves from the target object and the environment, convert the sound waves into electrical signals, and input them to the processor 901 for processing, or to the radio frequency circuit 904 for voice communication. For stereo acquisition or noise reduction, there may be multiple microphones, disposed at different parts of the terminal 900. The microphone may also be an array microphone or an omnidirectional pickup microphone. The speaker is used to convert electrical signals from the processor 901 or the radio frequency circuit 904 into sound waves. The speaker may be a conventional thin-film speaker or a piezoelectric ceramic speaker. When the speaker is a piezoelectric ceramic speaker, it can convert electrical signals not only into sound waves audible to humans but also into sound waves inaudible to humans for purposes such as ranging. In some embodiments, the audio circuit 907 may further include a headphone jack.
The power supply 908 is used to supply power to the various components in the terminal 900. The power supply 908 may use alternating current, direct current, a disposable battery, or a rechargeable battery. When the power supply 908 includes a rechargeable battery, the rechargeable battery may be a wired rechargeable battery, charged through a wired line, or a wireless rechargeable battery, charged through a wireless coil. The rechargeable battery may also support fast-charge technology.
In some embodiments, terminal 900 can further include one or more sensors 909. The one or more sensors 909 include, but are not limited to: acceleration sensor 910, gyroscope sensor 911, pressure sensor 912, optical sensor 913, and proximity sensor 914.
The acceleration sensor 910 may detect the magnitudes of acceleration on the three coordinate axes of a coordinate system established with the terminal 900. For example, the acceleration sensor 910 may be used to detect the components of gravitational acceleration on the three coordinate axes. The processor 901 may control the display 905 to display the user interface in a landscape view or a portrait view according to the gravitational acceleration signal acquired by the acceleration sensor 910. The acceleration sensor 910 may also be used to acquire motion data for games or of the target object.
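As a concrete illustration of the orientation control described above, the following is a minimal sketch, not taken from the patent, that picks landscape or portrait from the gravity components; the axis convention and the largest-component rule are assumptions.

```python
# Hypothetical helper: choose the UI orientation from the gravity components
# reported by the acceleration sensor. Assumes the device y axis runs along
# the long edge of the screen.
def orientation_from_gravity(gx: float, gy: float, gz: float) -> str:
    # Upright device: gravity falls mostly along y; device on its side:
    # gravity falls mostly along x.
    return "portrait" if abs(gy) >= abs(gx) else "landscape"
```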
The gyroscope sensor 911 may detect the body direction and rotation angle of the terminal 900, and may cooperate with the acceleration sensor 910 to collect 3D actions performed by the target object on the terminal 900. Based on the data collected by the gyroscope sensor 911, the processor 901 may implement the following functions: motion sensing (such as changing the UI according to a tilting operation by the target object), image stabilization during shooting, game control, and inertial navigation.
The pressure sensor 912 may be disposed on a side frame of the terminal 900 and/or under the display 905. When the pressure sensor 912 is disposed on a side frame of the terminal 900, it may detect a grip signal of the target object on the terminal 900, and the processor 901 performs left/right-hand recognition or a shortcut operation according to the grip signal collected by the pressure sensor 912. When the pressure sensor 912 is disposed under the display 905, the processor 901 controls operability controls on the UI according to the pressure applied by the target object to the display 905. The operability controls include at least one of a button control, a scroll bar control, an icon control, and a menu control.
The optical sensor 913 is used to collect the ambient light intensity. In one embodiment, the processor 901 may control the display brightness of the display 905 based on the ambient light intensity collected by the optical sensor 913: when the ambient light intensity is high, the display brightness of the display 905 is turned up; when the ambient light intensity is low, the display brightness of the display 905 is turned down. In another embodiment, the processor 901 may also dynamically adjust the shooting parameters of the camera assembly 906 according to the ambient light intensity collected by the optical sensor 913.
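The brightness adjustment above can be sketched as a simple mapping from measured illuminance to a normalized brightness; the lux bounds and the linear ramp are assumptions, since the text only states that brightness rises and falls with ambient light.

```python
# Hypothetical helper: map ambient illuminance (lux) to a display
# brightness in [0, 1] with a clamped linear ramp.
def display_brightness(ambient_lux: float,
                       min_lux: float = 10.0,
                       max_lux: float = 10000.0) -> float:
    clamped = min(max(ambient_lux, min_lux), max_lux)
    return (clamped - min_lux) / (max_lux - min_lux)
```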
The proximity sensor 914, also called a distance sensor, is typically disposed on the front panel of the terminal 900. The proximity sensor 914 is used to collect the distance between the target object and the front surface of the terminal 900. In one embodiment, when the proximity sensor 914 detects that the distance between the target object and the front surface of the terminal 900 gradually decreases, the processor 901 controls the display 905 to switch from the screen-on state to the screen-off state; when the proximity sensor 914 detects that this distance gradually increases, the processor 901 controls the display 905 to switch from the screen-off state to the screen-on state.
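The screen-state switching can be sketched with a small hysteresis rule; the 3 cm and 5 cm thresholds are assumptions, as the text only speaks of the distance gradually decreasing or increasing.

```python
# Hypothetical helper: decide the next screen state from the measured
# distance, with hysteresis so the screen does not flicker near a single
# threshold.
def next_screen_state(distance_cm: float, screen_on: bool) -> bool:
    if screen_on and distance_cm < 3.0:
        return False  # target object close to the front panel: screen off
    if not screen_on and distance_cm > 5.0:
        return True   # target object has moved away: screen back on
    return screen_on  # otherwise keep the current state
```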
Those skilled in the art will appreciate that the structure shown in fig. 9 is not limiting: more or fewer components than shown may be included, certain components may be combined, or a different arrangement of components may be employed.
The embodiment of the application also provides a computer-readable storage medium storing at least one computer program, the at least one computer program being loaded and executed by a processor of a terminal to implement the operations performed by the terminal in the image processing method of the above embodiments. For example, the computer-readable storage medium may be a read-only memory (ROM), a random access memory (RAM), a compact disc read-only memory (CD-ROM), a magnetic tape, a floppy disk, an optical data storage device, or the like.
Embodiments of the present application also provide a computer program product or computer program comprising computer program code stored in a computer-readable storage medium. The processor of the terminal reads the computer program code from the computer-readable storage medium and executes it, so that the terminal performs the image processing methods provided in the various optional implementations described above.
It will be appreciated by those of ordinary skill in the art that all or part of the steps of the above embodiments may be implemented by hardware, or by a program instructing related hardware, and the program may be stored in a computer-readable storage medium, where the storage medium may be a read-only memory, a magnetic disk, an optical disk, or the like.
The foregoing is merely illustrative of the present application and is not intended to limit it; any modification, equivalent replacement, improvement, or the like made within the spirit and principles of the present application shall fall within the scope of protection of the present application.

Claims (12)

1. An image processing method, the method comprising:
acquiring first vertex coordinates of a plurality of vertexes of a water surface area in a target image, wherein the vertexes are used for representing crossing points on the boundary of the water surface area, and the first vertex coordinates are used for representing positions of the vertexes in the target image;
determining target texture information of the plurality of vertexes based on dynamic information of the water surface area and a normal texture map, wherein the dynamic information comprises water flow speed and water flow time, the normal texture map is used for representing a corresponding relation between pixel points in the normal texture map and texture information, and the target texture information is used for representing textures of the vertexes;
rendering the water surface area based on the target texture information of the plurality of vertexes.
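To make the flow of the claimed method concrete, here is a minimal sketch of claim 1's three steps, with the per-step details deferred to callables; concrete versions of those callables are sketched after claims 4, 7, and 8 below. All function and parameter names are illustrative, not from the patent.

```python
# Hypothetical driver for the three steps of claim 1.
def process_image(boundary_vertices, texture_info_fn, render_fn):
    # Step 1: first vertex coordinates of the water-surface boundary vertexes.
    first_coords = list(boundary_vertices)
    # Step 2: target texture information per vertex, derived from the flow
    # dynamics and the normal texture map (claims 2-7).
    texture_info = [texture_info_fn(c) for c in first_coords]
    # Step 3: render the water surface area from the per-vertex textures
    # (claim 8).
    return render_fn(first_coords, texture_info)
```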
2. The method of claim 1, wherein the determining target texture information of the plurality of vertexes based on the dynamic information of the water surface area and the normal texture map comprises:
for any vertex, determining at least two second vertex coordinates of the vertex based on at least two water flow speeds and the water flow time in the dynamic information and the first vertex coordinates of the vertex, wherein the second vertex coordinates are used for representing positions of the vertex in the target image after offsetting;
and determining target texture information of the vertex based on the at least two second vertex coordinates and the normal texture map.
3. The method of claim 2, wherein the determining at least two second vertex coordinates of the vertex based on the at least two water flow speeds and the water flow time in the dynamic information and the first vertex coordinates of the vertex comprises:
determining third vertex coordinates corresponding to the first vertex coordinates based on a mapping relation between the target image and a target map, wherein the third vertex coordinates are used for representing the position of the vertex in the target map;
determining at least two fourth vertex coordinates of the vertex based on the third vertex coordinates, the at least two water flow speeds, and the water flow time, wherein the fourth vertex coordinates are used for representing positions of the vertex in the target map after offsetting;
and determining the at least two second vertex coordinates corresponding to the at least two fourth vertex coordinates based on the mapping relation.
4. The method of claim 3, wherein the determining at least two fourth vertex coordinates of the vertex based on the third vertex coordinates, the at least two water flow speeds, and the water flow time comprises:
determining at least two offsets of the vertex based on the at least two water flow speeds and the water flow time;
and shifting the third vertex coordinates based on the at least two offsets, respectively, to obtain the at least two fourth vertex coordinates.
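As a concrete reading of claims 3-4, the following minimal sketch assumes the mapping relation between the target image and the target map is an invertible affine transform (matrix M, offset b) and that each water flow speed is a 2D velocity vector; the claims do not fix these representations, so both are assumptions.

```python
import numpy as np

# Hypothetical helper for claims 3-4: offset a vertex in map space and
# return its "second vertex coordinates" back in image space.
def second_vertex_coords(first_xy, M, b, flow_speeds, flow_time):
    third = M @ first_xy + b                        # claim 3: image -> map position
    offsets = [v * flow_time for v in flow_speeds]  # claim 4: offset = speed * time
    fourths = [third + off for off in offsets]      # shifted positions in the map
    M_inv = np.linalg.inv(M)
    # map -> image via the inverse mapping: the second vertex coordinates
    return [M_inv @ (p - b) for p in fourths]
```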
5. The method of claim 2, wherein the normal texture map comprises normal vectors corresponding to pixel points;
the determining target texture information of the vertex based on the at least two second vertex coordinates and the normal texture map comprises:
acquiring at least two normal vectors from the normal texture map based on the at least two second vertex coordinates, wherein the at least two normal vectors are normal vectors of at least two pixel points indicated by the at least two second vertex coordinates;
and determining target texture information of the vertex based on the at least two normal vectors.
6. The method of claim 5, wherein the determining target texture information of the vertex based on the at least two normal vectors comprises:
acquiring initial texture coordinates of the vertex, wherein the initial texture coordinates correspond to the first vertex coordinates;
adding the at least two normal vectors to obtain a target normal vector;
determining target texture coordinates of the vertex based on the initial texture coordinates of the vertex and the target normal vector;
and determining the target texture information based on the target texture coordinates.
7. The method of claim 6, wherein the determining the target texture coordinates of the vertex based on the initial texture coordinates of the vertex and the target normal vector comprises:
shifting the initial texture coordinates based on the modulus of the target normal vector to obtain the target texture coordinates.
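Claims 5-7 together describe one computation: sample a normal vector at each offset position, add the samples, and shift the vertex's initial texture coordinates by the modulus (length) of the sum. The sketch below is one possible reading; nearest-pixel sampling and applying the scalar shift to both texture axes are assumptions the claims leave open.

```python
import numpy as np

# Hypothetical helper for claims 5-7. normal_map is an (H, W, 3) array,
# second_coords an iterable of (x, y) pixel positions, initial_uv a pair
# of texture coordinates.
def target_texture_coords(normal_map, second_coords, initial_uv):
    h, w, _ = normal_map.shape
    normals = []
    for x, y in second_coords:                  # claim 5: one lookup per position
        px = min(max(int(round(x)), 0), w - 1)  # clamp to the map bounds
        py = min(max(int(round(y)), 0), h - 1)
        normals.append(normal_map[py, px])
    target_normal = np.sum(normals, axis=0)     # claim 6: add the normal vectors
    shift = np.linalg.norm(target_normal)       # claim 7: modulus of the sum
    return np.asarray(initial_uv, dtype=float) + shift
```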
8. The method of any one of claims 1 to 7, wherein the rendering the water surface area based on the target texture information of the plurality of vertexes comprises:
acquiring a plurality of target colors from a color texture map based on the target texture information of the plurality of vertexes, wherein the color texture map is used for representing the corresponding relation between the texture information and the colors;
rendering the water surface area based on the plurality of target colors.
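The color lookup of claim 8 can be sketched as a texture fetch: the target texture coordinates index into the color texture map and the resulting color is used to render the vertex. Wrapping the coordinates into [0, 1) and nearest-pixel sampling are assumptions.

```python
import numpy as np

# Hypothetical helper for claim 8. color_map is an (H, W, C) array holding
# the color texture map; target_uv is the pair returned for one vertex.
def vertex_color(color_map, target_uv):
    h, w, _ = color_map.shape
    u, v = np.mod(np.asarray(target_uv, dtype=float), 1.0)  # wrap into [0, 1)
    px = min(int(u * w), w - 1)
    py = min(int(v * h), h - 1)
    return color_map[py, px]  # the target color used to render this vertex
```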
9. An image processing apparatus, characterized in that the apparatus comprises:
an acquisition module, used for acquiring first vertex coordinates of a plurality of vertexes of a water surface area in a target image, wherein the vertexes are used for representing crossing points on the boundary of the water surface area, and the first vertex coordinates are used for representing positions of the vertexes in the target image;
a determining module, used for determining target texture information of the plurality of vertexes based on dynamic information of the water surface area and a normal texture map, wherein the dynamic information comprises water flow speed and water flow time, the normal texture map is used for representing the corresponding relation between pixel points in the normal texture map and texture information, and the target texture information is used for representing textures of the vertexes;
and a rendering module, used for rendering the water surface area based on the target texture information of the plurality of vertexes.
10. A computer device, characterized in that the computer device comprises a processor and a memory, the memory being used for storing at least one computer program, and the at least one computer program being loaded and executed by the processor to implement the image processing method according to any one of claims 1 to 8.
11. A computer-readable storage medium, characterized in that the computer-readable storage medium is used for storing at least one computer program, and the at least one computer program is loaded and executed by a processor to implement the image processing method according to any one of claims 1 to 8.
12. A computer program product comprising a computer program which, when executed by a processor, implements the image processing method according to any one of claims 1 to 8.
CN202210288791.7A 2022-03-22 2022-03-22 Image processing method, device, computer equipment and storage medium Pending CN116828207A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210288791.7A CN116828207A (en) 2022-03-22 2022-03-22 Image processing method, device, computer equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210288791.7A CN116828207A (en) 2022-03-22 2022-03-22 Image processing method, device, computer equipment and storage medium

Publications (1)

Publication Number Publication Date
CN116828207A true CN116828207A (en) 2023-09-29

Family

ID=88127977

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210288791.7A Pending CN116828207A (en) 2022-03-22 2022-03-22 Image processing method, device, computer equipment and storage medium

Country Status (1)

Country Link
CN (1) CN116828207A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117274465A (en) * 2023-11-22 2023-12-22 园测信息科技股份有限公司 Water rendering method, system, medium and equipment matched with real geographic water area environment
CN117274465B (en) * 2023-11-22 2024-03-08 园测信息科技股份有限公司 Water rendering method, system, medium and equipment matched with real geographic water area environment

Similar Documents

Publication Publication Date Title
US11205282B2 (en) Relocalization method and apparatus in camera pose tracking process and storage medium
CN112870707B (en) Virtual object display method in virtual scene, computer device and storage medium
CN108245893B (en) Method, device and medium for determining posture of virtual object in three-dimensional virtual environment
CN109712224B (en) Virtual scene rendering method and device and intelligent device
CN110097576B (en) Motion information determination method of image feature point, task execution method and equipment
CN111464749B (en) Method, device, equipment and storage medium for image synthesis
CN110064200B (en) Object construction method and device based on virtual environment and readable storage medium
CN111701238A (en) Virtual picture volume display method, device, equipment and storage medium
CN109948581B (en) Image-text rendering method, device, equipment and readable storage medium
CN112907716B (en) Cloud rendering method, device, equipment and storage medium in virtual environment
CN110920631B (en) Method and device for controlling vehicle, electronic equipment and readable storage medium
CN112884873B (en) Method, device, equipment and medium for rendering virtual object in virtual environment
CN110928464B (en) User interface display method, device, equipment and medium
CN111784841B (en) Method, device, electronic equipment and medium for reconstructing three-dimensional image
CN111105474B (en) Font drawing method, font drawing device, computer device and computer readable storage medium
CN112308103B (en) Method and device for generating training samples
CN112750190B (en) Three-dimensional thermodynamic diagram generation method, device, equipment and storage medium
CN116828207A (en) Image processing method, device, computer equipment and storage medium
CN110992268B (en) Background setting method, device, terminal and storage medium
CN112306332A (en) Method, device and equipment for determining selected target and storage medium
CN113209610B (en) Virtual scene picture display method and device, computer equipment and storage medium
CN114764295B (en) Stereoscopic scene switching method, stereoscopic scene switching device, terminal and storage medium
CN110335224B (en) Image processing method, image processing device, computer equipment and storage medium
CN110097619B (en) Animation effect implementation method, device and equipment in application program
CN109388732B (en) Music map generating and displaying method, device and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination