CN117496033A - Mapping processing method and device, computer readable storage medium and electronic device


Info

Publication number: CN117496033A
Application number: CN202311390758.6A
Authority: CN (China)
Other languages: Chinese (zh)
Prior art keywords: distance, mapping, preset, target, refraction
Inventors: 张孜博, 赵进
Current assignee: Netease Hangzhou Network Co Ltd (the listed assignees may be inaccurate; Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list)
Original assignee: Netease Hangzhou Network Co Ltd
Application filed by Netease Hangzhou Network Co Ltd
Priority to CN202311390758.6A
Legal status: Pending (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 15/00: 3D [Three Dimensional] image rendering
    • G06T 15/04: Texture mapping
    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 13/00: Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F 13/60: Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 15/00: 3D [Three Dimensional] image rendering
    • G06T 15/005: General purpose rendering architectures

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Graphics (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Image Generation (AREA)

Abstract

The application discloses a mapping processing method and device, a computer readable storage medium, and an electronic device, relating to the technical field of image processing. The method comprises the following steps: determining a distortion value based on a preset map, wherein the preset map is used to simulate a distortion manner of refraction processing; performing offset processing on first texture coordinates of an initial map based on the distortion value to obtain a distortion map; determining target texture coordinates based on the first texture coordinates, second texture coordinates of the distortion map, and distance information; and performing texture sampling on the initial map based on the target texture coordinates to obtain a target map, wherein the target map is used to simulate a refraction effect obtained after refraction processing is performed on a target object, and the distortion strength of the target map changes based on a third distance between the target object and a preset refraction plane. The method and the device solve the technical problem in the related art that the semitransparent refraction effect is incorrect when technologies such as screen-space refraction and ray tracing are used to simulate refraction.

Description

Mapping processing method and device, computer readable storage medium and electronic device
Technical Field
The present invention relates to the field of image processing technologies, and in particular, to a mapping processing method and apparatus, a computer readable storage medium, and an electronic device.
Background
When producing game images, it is often necessary to simulate the refraction effects of objects such as glass and water surfaces to increase the realism of a game scene. Currently, the screen-space refraction effect built into Unreal Engine (UE) is generally used to simulate refraction, but its semitransparent refraction effect is incorrect. In addition, technologies such as camera render targets (RT), cube maps, and ray tracing are also used for rendering, but their consumption is relatively large.
In view of the above problems, no effective solution has been proposed at present.
Disclosure of Invention
At least some embodiments of the present application provide a mapping processing method and apparatus, a computer readable storage medium, and an electronic device, so as to at least solve the technical problem in the related art that the semitransparent refraction effect is incorrect when technologies such as screen-space refraction and ray tracing are used to simulate refraction.
According to one embodiment of the present application, there is provided a mapping method, including: determining a distortion value based on a preset map, wherein the preset map is used for simulating a distortion mode of refraction processing, and the distortion value is used for representing the distortion intensity of any pixel of the preset map in a screen space; performing offset processing on a first texture coordinate of the initial mapping based on the distortion value to obtain a distortion mapping, wherein the initial mapping is the mapping to be mapped onto the target object; determining a target texture coordinate based on the first texture coordinate, a second texture coordinate of the distortion map and distance information, wherein the distance information comprises a first distance and a second distance, the first distance is a distance between a first position of a pixel point of the initial map on the target object and a preset refraction plane, and the second distance is a distance between a second position of the pixel point of the distortion map on the target object and the preset refraction plane; and performing texture sampling on the initial mapping based on the target texture coordinates to obtain a target mapping, wherein the target mapping is used for simulating a refraction effect obtained after refraction processing is performed on the target object, and the distortion strength of the target mapping is changed based on a third distance between the target object and a preset refraction plane.
According to one embodiment of the present application, there is also provided a mapping processing apparatus, including: the first determining module is used for determining a distortion value based on a preset map, wherein the preset map is used for simulating a distortion mode of refraction processing, and the distortion value is used for representing the distortion intensity of any pixel of the preset map in a screen space; the offset module is used for performing offset processing on the first texture coordinates of the initial mapping based on the distortion values to obtain a distortion mapping, wherein the initial mapping is a mapping to be mapped onto a target object; the second determining module is used for determining the target texture coordinate based on the first texture coordinate, the second texture coordinate of the distortion map and the distance information, wherein the distance information comprises a first distance and a second distance, the first distance is the distance between the first position of the pixel point of the initial map on the target object and the preset refraction plane, and the second distance is the distance between the second position of the pixel point of the distortion map on the target object and the preset refraction plane; the sampling module is used for carrying out texture sampling on the initial mapping based on the target texture coordinates to obtain a target mapping, wherein the target mapping is used for simulating a refraction effect obtained after refraction processing is carried out on a target object, and the distortion strength of the target mapping is changed based on a third distance between the target object and a preset refraction plane.
According to one embodiment of the present application, there is also provided a computer readable storage medium having a computer program stored therein, wherein the computer program is configured to perform the mapping method of the above embodiment when run on a computer or a processor.
According to one embodiment of the present application, there is also provided an electronic device including a memory, in which a computer program is stored, and a processor configured to run the computer program to perform the mapping method in the above embodiment.
In at least some embodiments of the present application, a distortion value is determined based on a preset map, where the preset map is used to simulate a distortion manner of refraction processing and the distortion value is used to represent the distortion intensity of any pixel of the preset map in screen space; offset processing is performed on first texture coordinates of an initial map based on the distortion value to obtain a distortion map, where the initial map is the map to be mapped onto the target object; target texture coordinates are determined based on the first texture coordinates, second texture coordinates of the distortion map, and distance information, where the distance information includes a first distance and a second distance, the first distance being the distance between a first position of a pixel point of the initial map on the target object and a preset refraction plane, and the second distance being the distance between a second position of the pixel point of the distortion map on the target object and the preset refraction plane; and texture sampling is performed on the initial map based on the target texture coordinates to obtain a target map, where the target map is used to simulate the refraction effect obtained after refraction processing is performed on the target object, and the distortion strength of the target map changes based on a third distance between the target object and the preset refraction plane. In this way, the refraction intensity changes with the distance between an object and the refraction plane, with greater distance producing stronger refraction, so a semitransparent refraction effect is correctly simulated at relatively low cost and the realism and detail of rendering are increased, which solves the technical problem in the related art that the semitransparent refraction effect is incorrect when technologies such as screen-space refraction and ray tracing are used to simulate refraction.
Drawings
The accompanying drawings, which are included to provide a further understanding of the application and are incorporated in and constitute a part of this application, illustrate embodiments of the application and together with the description serve to explain the application and do not constitute an undue limitation to the application. In the drawings:
FIG. 1 is a schematic diagram of an image rendered using the refraction effect built into UE;
FIG. 2 is a block diagram of a hardware architecture of a mobile terminal for mapping processing according to an embodiment of the present application;
FIG. 3 is a flow chart of a mapping processing method according to one embodiment of the present application;
FIG. 4 is a schematic representation of an image rendered by a method according to one embodiment of the present application;
FIG. 5 is a schematic diagram of a positional relationship according to one embodiment of the present application;
FIG. 6 is a block diagram of a mapping processing apparatus according to an alternative embodiment of the present application;
fig. 7 is a schematic diagram of an electronic device according to an embodiment of the present application.
Detailed Description
For ease of understanding, some of the concepts involved in the embodiments of the present application are described below by way of example, as follows:
Screen-space refraction: a technique commonly used in computer graphics to simulate the refraction effects of objects such as glass and water surfaces in real-time rendering, increasing the realism and detail of rendering; it is widely applied in fields such as games and film.
In order to make the present application solution better understood by those skilled in the art, the following description will be made in detail and with reference to the accompanying drawings in the embodiments of the present application, it is apparent that the described embodiments are only some embodiments of the present application, not all embodiments. All other embodiments, which can be made by one of ordinary skill in the art based on the embodiments herein without making any inventive effort, shall fall within the scope of the present application.
It should be noted that the terms "first," "second," and the like in the description and claims of the present application and the above figures are used for distinguishing between similar objects and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used may be interchanged where appropriate such that embodiments of the present application described herein may be implemented in sequences other than those illustrated or otherwise described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
In one possible implementation, in the field of image processing technology, a refraction effect is typically rendered using the screen-space refraction effect built into UE.
Through practice and careful study, the inventors have found that the above method still has several problems: content in front of the refraction plane is also refracted, the semitransparent refraction effect is incorrect, and consumption is large. As shown in fig. 1, fig. 1 is a schematic diagram of an image rendered using the refraction effect built into UE, where the refraction plane is formed by the numerals 1-9. A reference sphere at a fixed position behind the refraction plane provides a reference refraction intensity, and the target sphere is a sphere whose position is not fixed: it can move in front of the refraction plane, behind it, or back and forth through it.
It will be appreciated that when the target sphere is in front of the refraction plane, the correct refraction effect is that no refraction of the sphere occurs, since only content behind the refraction plane should be refracted. However, as can be seen from diagram (a) of fig. 1, when the refraction effect built into UE is used for rendering, an incorrect shadow, i.e. an incorrect refraction effect, still appears while the target sphere is in front of the refraction plane. Furthermore, when the target sphere is behind the refraction plane, the correct refraction effect is that the refraction intensity varies with the sphere's distance from the plane. However, as can be seen from diagrams (b) and (c) of fig. 1, the refraction intensity A of the target sphere when it is behind and close to the refraction plane is the same as the refraction intensity B when it is behind and far from the refraction plane. That is, although the distances between the target sphere and the refraction plane differ in (b) and (c), the degree of distortion of the target sphere is the same, so the refraction effect is incorrect.
Based on this, the embodiments of the present application, which may be applied to the field of image processing in games, provide a mapping processing method: a distortion value is determined based on a preset map, where the preset map is used to simulate a distortion manner of refraction processing and the distortion value is used to represent the distortion intensity of any pixel of the preset map in screen space; offset processing is performed on first texture coordinates of an initial map based on the distortion value to obtain a distortion map, where the initial map is the map to be mapped onto the target object; target texture coordinates are determined based on the first texture coordinates, second texture coordinates of the distortion map, and distance information, where the distance information includes a first distance and a second distance, the first distance being the distance between a first position of a pixel point of the initial map on the target object and a preset refraction plane, and the second distance being the distance between a second position of the pixel point of the distortion map on the target object and the preset refraction plane; and texture sampling is performed on the initial map based on the target texture coordinates to obtain a target map, where the target map is used to simulate the refraction effect obtained after refraction processing is performed on the target object, and the distortion strength of the target map changes based on a third distance between the target object and the preset refraction plane. In this way, the refraction intensity changes with the distance between an object and the refraction plane, with greater distance producing stronger refraction, so a semitransparent refraction effect is correctly simulated at relatively low cost and the realism and detail of rendering are increased, which solves the technical problem in the related art that the semitransparent refraction effect is incorrect when technologies such as screen-space refraction and ray tracing are used to simulate refraction.
The method embodiments described in the present application may be performed in a mobile terminal, a computer terminal, or a similar computing device. Taking running on a mobile terminal as an example, the mobile terminal may be a terminal device such as a smart phone, a palmtop computer, a mobile internet device, a tablet computer (PAD), or a game machine. Fig. 2 is a block diagram of a hardware structure of a mobile terminal for mapping processing according to an embodiment of the present application. As shown in fig. 2, the mobile terminal may include one or more processors 202 (only one is shown in fig. 2; the processors 202 may include, but are not limited to, a central processing unit (CPU), a graphics processing unit (GPU), a digital signal processing (DSP) chip, a microcontroller unit (MCU), a field-programmable gate array (FPGA), a neural network processing unit (NPU), a tensor processing unit (TPU), an artificial intelligence (AI) processor, and the like) and a memory 204 for storing data. In one embodiment of the present application, the mobile terminal may further include: a transmission device 206, an input/output device 208, and a display device 210.
In some optional embodiments based on game scenes, the device may further provide a human-machine interaction interface with a touch-sensitive surface that senses finger contacts and/or gestures to interact with a graphical user interface (GUI). The human-machine interaction functions may include interactions such as creating web pages, drawing, word processing, producing electronic documents, games, video conferencing, instant messaging, sending and receiving electronic mail, call interfaces, playing digital video, playing digital music, and/or web browsing. Executable instructions for performing these human-machine interaction functions are configured/stored in a computer program product or readable storage medium executable by one or more processors.
It will be appreciated by those skilled in the art that the structure shown in fig. 2 is merely illustrative and not limiting on the structure of the mobile terminal described above. For example, the mobile terminal may also include more or fewer components than shown in fig. 2, or have a different configuration than shown in fig. 2.
According to one embodiment of the present application, an embodiment of a mapping processing method is provided. It should be noted that the steps illustrated in the flowchart of the drawings may be performed in a computer system, such as one executing a set of computer-executable instructions, and that, although a logical order is illustrated in the flowchart, in some cases the steps illustrated or described may be performed in an order different from the one described herein.
In a possible implementation manner, an embodiment of the present application provides a mapping processing method, and fig. 3 is a flowchart of the mapping processing method according to one embodiment of the present application, as shown in fig. 3, and the method includes the following steps:
In step S30, a distortion value is determined based on the preset map.
The preset mapping is used for simulating a distortion mode of refraction processing, and the distortion value is used for representing the distortion intensity of any pixel of the preset mapping in a screen space.
The preset mapping is a custom mapping, and is used for determining a distortion standard of the mapping to be processed, namely, a distortion mode for simulating refraction processing, so that the mapping to be processed is distorted in a distortion mode represented by the preset mapping.
The preset map may be a normal map. Since a normal map uses only its R and G channels, which represent the horizontal and vertical surface relief of an object respectively, it can define the distortion criterion of the map to be processed.
The distortion value refers to the distortion strength and can be understood as the refraction strength. It will be appreciated that each pixel in the preset map corresponds to a distortion value that indicates in which direction the pixel is distorted; thus, the distortion values include the distortion strength in screen space corresponding to each pixel in the preset map.
By determining the distortion values, the distortion manner, i.e. the refraction manner, of the map to be processed can be determined.
In step S32, offset processing is performed on the first texture coordinates of the initial map based on the distortion value to obtain a distortion map.
Wherein the initial map is a map to be mapped onto the target object.
The initial mapping is mapping to be mapped to a target object, and the target object can be understood as an object to be subjected to refraction treatment, namely an object needing to be subjected to refraction through a refraction plane.
The first texture coordinates include texture coordinates for each pixel in the initial map.
Offset processing is performed on the first texture coordinates of the corresponding pixels in the initial map based on the distortion values, so that the initial map is distorted according to the distortion manner of the preset map, obtaining the distortion map.
Step S34, determining target texture coordinates based on the first texture coordinates, the second texture coordinates of the distortion map, and the distance information.
The distance information comprises a first distance and a second distance, wherein the first distance is the distance between a first position of a pixel point of the initial mapping on the target object and a preset refraction plane, and the second distance is the distance between a second position of the pixel point of the distortion mapping on the target object and the preset refraction plane.
The first position may be understood as a position of a pixel of an image obtained after the initial map is mapped to the target object, and the first distance is a distance between the position of the pixel of the image obtained after the initial map is mapped to the target object and a preset refraction plane. Correspondingly, the second position may be understood as a position of a pixel of an image obtained by mapping the distortion map to the target object, and the second distance is a distance between the position of the pixel of the image obtained by mapping the distortion map to the target object and a preset refraction plane.
It will be appreciated that for a correct refraction effect the refraction intensity varies with the distance of the object from the refraction plane: the closer the object is to the refraction plane, the smaller the refraction intensity; conversely, the farther the object is from the refraction plane, the greater the refraction intensity. For example, in a horror film, a ghost runs from a distance toward a pane of frosted glass (the refraction plane) and presses against it; during this process, the form of the ghost seen by an observer from the other side of the glass changes from very blurred (high refraction intensity) to clear (low refraction intensity). For another example, when a car window is covered with rain, distant pedestrians and vehicles (objects far from the refraction plane) appear unclear from inside the car (large refraction intensity), while the wiper blade (an object near the refraction plane) remains clearly visible (small refraction intensity). Therefore, when rendering a refraction effect, the relationship between the object's distance from the refraction plane and the refraction intensity needs to be considered, so that the correct refraction effect of the object can be rendered.
In consideration of the relationship between the distance between the target object and the preset refraction plane and the refraction intensity, in the embodiment of the present application, the target texture coordinate is determined based on the first texture coordinate of the initial map, the second texture coordinate of the distortion map, the first distance between the first position of the pixel point of the initial map on the target object and the preset refraction plane, and the second distance between the second position of the pixel point of the distortion map on the target object and the preset refraction plane, so that the obtained target texture coordinate is determined based on the distance between the target object and the preset refraction plane, and further a correct semitransparent refraction effect is achieved.
Step S36, performing texture sampling on the initial mapping based on the target texture coordinates to obtain a target mapping.
The target map is used for simulating a refraction effect obtained after refraction treatment is carried out on the target object, and the distortion strength of the target map is changed based on a third distance between the target object and a preset refraction plane.
The target mapping can be understood as a mapping obtained by simulating the refraction effect of the target object according to the mapping processing method provided by the embodiment of the present application, and because the mapping processing method provided by the embodiment of the present application changes the refraction intensity based on the distance change between the target object and the preset refraction plane, the initial mapping is texture-sampled based on the target texture coordinates, and the obtained target mapping can achieve the technical effects that the closer the distance between the target object and the preset refraction plane is, the smaller the refraction intensity is, the further the distance between the target object and the preset refraction plane is, and the larger the refraction intensity is.
As shown in fig. 4, fig. 4 is a schematic diagram of an image rendered by the method according to one embodiment of the present application; the refraction plane, target sphere, and reference sphere are the same as in fig. 1. It will be appreciated that the image shown in fig. 4 is a rendered representation of the target map mapped onto the target object. Diagram (a) of fig. 4 simulates the refraction effect when the target sphere is in front of the refraction plane; it can be seen that the incorrect refraction shadow does not appear in diagram (a), i.e. the refraction effect is correct. Diagrams (b) and (c) of fig. 4 simulate the refraction effect when the target sphere is located at different distances behind the refraction plane. It can be seen that the refraction intensity C when the target sphere is behind and close to the refraction plane differs from the refraction intensity D when it is behind and far from the refraction plane, with C smaller than D. This achieves the technical effect that the closer the object is to the refraction plane, the smaller the refraction intensity, and the farther the object is from the refraction plane, the greater the refraction intensity.
Through the above steps, a distortion value is determined based on a preset map, where the preset map is used to simulate a distortion manner of refraction processing and the distortion value is used to represent the distortion intensity of any pixel of the preset map in screen space; offset processing is performed on first texture coordinates of an initial map based on the distortion value to obtain a distortion map, where the initial map is the map to be mapped onto the target object; target texture coordinates are determined based on the first texture coordinates, second texture coordinates of the distortion map, and distance information, where the distance information includes a first distance and a second distance, the first distance being the distance between a first position of a pixel point of the initial map on the target object and a preset refraction plane, and the second distance being the distance between a second position of the pixel point of the distortion map on the target object and the preset refraction plane; and texture sampling is performed on the initial map based on the target texture coordinates to obtain a target map, where the target map is used to simulate the refraction effect obtained after refraction processing is performed on the target object, and the distortion strength of the target map changes based on a third distance between the target object and the preset refraction plane. In this way, the refraction intensity changes with the distance between an object and the refraction plane, with greater distance producing stronger refraction, so a semitransparent refraction effect is correctly simulated at relatively low cost and the realism and detail of rendering are increased, which solves the technical problem in the related art that the semitransparent refraction effect is incorrect when technologies such as screen-space refraction and ray tracing are used to simulate refraction.
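For illustration only, the following Python/NumPy sketch summarizes steps S30 to S36 on the CPU. It is not the shader implementation of the embodiments; the function name, the 0-500 distance range, and the warp_scale constant are assumptions introduced here, and the re-centering of the preset map assumes a conventional unsigned normal-map encoding.

    import numpy as np

    def refraction_uv(preset_map, init_uv, dist_init, dist_warp, warp_scale=0.05):
        # preset_map: HxWx2 array holding the R/G channels of the preset map.
        # init_uv:    HxWx2 first texture coordinates in [0, 1].
        # dist_init:  HxW first distances (initial-map positions to the plane).
        # dist_warp:  HxW second distances (distortion-map positions to the plane).

        # Step S30: per-pixel distortion values from the preset map's R/G
        # channels (the *2-1 re-centering assumes unsigned normal encoding).
        warp = (preset_map * 2.0 - 1.0) * warp_scale

        # Step S32: offset the first texture coordinates to obtain the
        # distortion map's texture coordinates.
        warp_uv = init_uv + warp

        # Step S34: remap both distances to a preset range (0-500 here is an
        # assumption), fuse the two weights, and interpolate between the UVs.
        w1 = np.clip(dist_init / 500.0, 0.0, 1.0)
        w2 = np.clip(dist_warp / 500.0, 0.0, 1.0)
        weight = np.minimum(w1, w2)[..., None]
        target_uv = (1.0 - weight) * init_uv + weight * warp_uv

        # Step S36 would sample the initial map at target_uv (sampler omitted).
        return target_uv

Because the weight goes to zero near the plane, pixels close to the refraction plane keep their original coordinates (little distortion), while distant pixels take the fully offset coordinates (strong distortion).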
In a possible implementation manner, in step S30, determining the distortion value based on the preset map may include the following steps:
step S300, obtaining a third texture coordinate of a preset map;
step S301, scaling the third texture coordinates based on a preset scaling ratio to obtain a scaled map;
step S302, determining a distortion value based on the first channel value of the scaled map, the second channel value of the scaled map, and a preset distortion ratio.
The preset scaling is a scaling value for texture coordinates, which may be determined based on actual art requirements, and is not limited herein. The first channel value may be an R channel value and the second channel value may be a G channel value. The preset distortion ratio is used to adjust the distortion ratio of the screen space, and may be determined based on actual art requirements, without limitation.
When determining the corresponding distortion value of each pixel based on the preset map, first obtaining the third texture coordinate of the preset map, multiplying the preset scaling ratio determined according to the art requirement by the third texture coordinate, and scaling the preset map to obtain the scaled map. And then obtaining an R channel value and a G channel value of the scaling map, and multiplying the R channel value, the G channel value and a preset distortion proportion, so as to determine a distortion value corresponding to each pixel in the preset map.
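As a concrete illustration of steps S300 to S302, the fragment below scales the preset map's texture coordinates and converts its R and G channels into per-pixel distortion values. The tiling and ratio constants, the nearest-neighbour fetch, and the *2-1 re-centering are simplifying assumptions, not values taken from the embodiments.

    import numpy as np

    def distortion_values(preset_map, uv, tiling=2.0, distortion_ratio=0.05):
        # Steps S300/S301: scale the third texture coordinates by the preset
        # scaling ratio (the tiling value here is an assumption).
        scaled_uv = uv * tiling

        # Sample the preset map at the scaled coordinates (nearest fetch,
        # wrapped addressing, for brevity).
        h, w, _ = preset_map.shape
        ix = ((scaled_uv[..., 0] % 1.0) * (w - 1)).astype(int)
        iy = ((scaled_uv[..., 1] % 1.0) * (h - 1)).astype(int)
        rg = preset_map[iy, ix, :2]          # R and G channel values

        # Step S302: multiply the R/G channel values by the preset distortion
        # ratio; *2-1 assumes the usual unsigned normal-map encoding.
        return (rg * 2.0 - 1.0) * distortion_ratio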
In a possible implementation manner, in step S32, performing offset processing on the first texture coordinates of the initial map based on the distortion value to obtain the distortion map may include the following step: adding the distortion value to the corresponding first texture coordinate to obtain the distortion map.
When the first texture coordinates of the initial map are offset based on the distortion values, each distortion value is added to the corresponding first texture coordinate, and the distortion map is obtained.
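In other words, the offset is a single per-component addition. A minimal sketch with example values (all numbers are hypothetical):

    import numpy as np

    init_uv = np.array([[0.25, 0.50]])    # example first texture coordinate
    warp    = np.array([[0.01, -0.02]])   # example distortion value

    # Adding the distortion value to the corresponding first texture
    # coordinate yields the distortion map's texture coordinate.
    warp_uv = init_uv + warp              # -> [[0.26, 0.48]]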
In a possible implementation manner, in step S34, determining the target texture coordinates based on the first texture coordinates, the second texture coordinates of the distortion map, and the distance information may include performing the following steps:
step S340, determining a target weight value based on the first distance and the second distance.
The target weight value is used for representing weight values corresponding to different distances between the target object and a preset refraction plane.
The first distance is the distance between the first position of the pixel point of the initial mapping on the target object and the preset refraction plane, the second distance is the distance between the second position of the pixel point of the distortion mapping on the target object and the preset refraction plane, and the weight values corresponding to different distances between the object and the refraction plane can be determined based on the first distance and the second distance, so that the refraction intensity corresponding to different distances can be determined based on the target weight values.
Step S341, interpolation processing is performed on the first texture coordinates, the second texture coordinates and the target weight value to obtain target texture coordinates.
Interpolation processing (such as Lerp processing) is performed on the first texture coordinates of the initial map, the second texture coordinates of the distortion map, and the target weight values corresponding to different distances between the target object and the preset refraction plane, to obtain the target texture coordinates, so that the refraction intensity can be controlled through the distances. The target map can thus transition smoothly between the initial map and the distortion map based on the distance between the target object and the preset refraction plane: the closer the target object is to the preset refraction plane, the smaller the refraction intensity, i.e. the smaller the degree of distortion; the farther the target object is from the preset refraction plane, the greater the refraction intensity, i.e. the greater the degree of distortion. This accurately simulates the semitransparent refraction effect.
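A minimal sketch of the interpolation in step S341, assuming the standard Lerp definition; the coordinate and weight values are hypothetical:

    import numpy as np

    def lerp(a, b, t):
        # Standard linear interpolation, as in a shader Lerp node.
        return a + (b - a) * t

    init_uv = np.array([0.40, 0.60])   # first texture coordinate
    warp_uv = np.array([0.43, 0.55])   # second texture coordinate
    weight  = 0.2                      # target weight value from the distances

    # Near the plane (weight -> 0) the result stays at init_uv, so the
    # distortion is weak; far from it (weight -> 1) it approaches warp_uv.
    target_uv = lerp(init_uv, warp_uv, weight)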
In a possible implementation manner, in step S340, determining the target weight value based on the first distance and the second distance may include performing the steps of:
step S3400, mapping the first distance to a preset distance range to obtain a first weight value;
step S3401, mapping the second distance to a preset distance range to obtain a second weight value;
step S3402, carrying out fusion processing on the first weight value and the second weight value to obtain a target weight value.
The first weight value is used for representing weight values corresponding to different distances between the first position and a preset refraction plane, and the second weight value is used for representing weight values corresponding to different distances between the second position and the preset refraction plane.
The preset distance range is used for defining a distance range between the pixel point position of the image and a preset refraction plane, namely, the distance range between the pixel point position of the image obtained after mapping the map to the target object and the refraction plane, and the distance range can be determined based on actual art requirements through the distance scaling value representation, and is not limited herein.
It will be appreciated that the technical effect of a smooth transition can be achieved only if the distance range between the first position of the pixel point of the initial map on the target object and the preset refraction plane is the same as the distance range between the second position of the pixel point of the distortion map on the target object and the preset refraction plane. Therefore, the first distance and the second distance need to be mapped (e.g. via Remap processing) to the same preset distance range, for example (0, 500), so that the smooth transition between the initial map and the distortion map based on the difference in distance can be achieved correctly.
When determining the target weight value, it is necessary to determine a first weight value mapping the first distance to a preset distance range and a second weight value mapping the second distance to the preset distance range, and the target weight value is obtained by performing fusion processing (for example, min processing) on the first weight value and the second weight value.
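The following sketch illustrates the Remap and min fusion described above; the (0, 500) range follows the example in the text, while the sample distances are hypothetical:

    import numpy as np

    def remap01(d, d_min=0.0, d_max=500.0):
        # Map a plane distance into [0, 1]; the range would be tuned to
        # actual art requirements.
        return float(np.clip((d - d_min) / (d_max - d_min), 0.0, 1.0))

    w1 = remap01(120.0)   # first distance  -> first weight value
    w2 = remap01(180.0)   # second distance -> second weight value

    # Fusion as a min: the smaller weight wins, so a position close to the
    # refraction plane along either path is distorted less.
    target_weight = min(w1, w2)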
In a possible implementation manner, in step S34, determining the target texture coordinates based on the first texture coordinates, the second texture coordinates of the distortion map, and the distance information may include performing the following steps:
step S342, a third distance between the preset refraction plane and the camera, a fourth distance between the first position and the camera and a fifth distance between the second position and the camera are obtained;
step S343, subtracting the third distance from the fourth distance to obtain the first distance;
step S344, subtracting the third distance from the fifth distance to obtain the second distance.
As shown in fig. 5, fig. 5 is a schematic diagram of a positional relationship according to an embodiment of the present application. A three-dimensional coordinate system containing the positions of the camera, the refraction plane, and the object can be established, where the camera position can be understood as the screen position. It will be appreciated that in this three-dimensional coordinate system (i.e. looking into the screen from the screen position), the object may move in front of the refraction plane, behind it, or back and forth through it.
Thus, when determining the target texture coordinates, a third distance between the preset refraction plane and the camera, a fourth distance between the camera and the first position of the initial map's pixel point on the target object, and a fifth distance between the camera and the second position of the distortion map's pixel point on the target object are obtained; the distances from the first and second positions to the camera can be regarded as the distance between the object and the camera in fig. 5. Subtracting the third distance from the fourth distance gives the first distance between the first position and the preset refraction plane, and subtracting the third distance from the fifth distance gives the second distance between the second position and the preset refraction plane.
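Numerically, steps S343 and S344 are two subtractions; the distances below are hypothetical examples:

    # third distance: refraction plane to camera
    dist_plane_cam = 300.0
    # fourth and fifth distances: first and second positions to camera
    dist_pos1_cam = 420.0
    dist_pos2_cam = 455.0

    # Subtracting the third distance from the fourth (or fifth) gives the
    # distance to the plane; positive values mean the position lies behind
    # the refraction plane as seen from the camera.
    first_distance  = dist_pos1_cam - dist_plane_cam   # 120.0
    second_distance = dist_pos2_cam - dist_plane_cam   # 155.0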
In a possible implementation manner, in step S342, obtaining the third distance may include the following steps:
step S3420, subtracting the position of the camera from the position of the preset refraction plane to obtain a three-dimensional distance;
in step S3421, a third distance is determined based on the three components of the three-dimensional distance.
When obtaining the third distance between the preset refraction plane and the camera, the position of the camera can be subtracted from the position of the preset refraction plane to obtain three-dimensional coordinates (a, b, c) representing the three-dimensional distance. The third distance between the preset refraction plane and the camera is then calculated from the three components using the formula √(a² + b² + c²).
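A short sketch of steps S3420 and S3421, with hypothetical positions:

    import numpy as np

    camera_pos = np.array([0.0, 0.0, 0.0])       # example camera position
    plane_pos  = np.array([100.0, 200.0, 50.0])  # example plane position

    # Step S3420: component-wise subtraction gives the three-dimensional
    # distance (a, b, c).
    diff = plane_pos - camera_pos

    # Step S3421: the third distance is the Euclidean length
    # sqrt(a^2 + b^2 + c^2), i.e. np.linalg.norm(diff).
    third_distance = np.sqrt(np.sum(diff ** 2))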
In a possible implementation manner, in step S342, acquiring the fourth distance and the fifth distance may include performing the steps of:
step S3422, determining a fourth distance based on the depth information of the first position;
step S3423, determining a fifth distance based on the depth information of the second position.
The depth information of a position can be understood as numerical information of the Z axis of the position in the three-dimensional coordinate system shown in fig. 5.
The fourth distance between the camera and the first position of the initial map's pixel point on the target object can be determined from the depth information of the first position in the three-dimensional coordinate system, and the fifth distance between the camera and the second position of the distortion map's pixel point on the target object can be determined from the depth information of the second position.
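A minimal sketch under the reading given above (depth = Z value in the coordinate system of fig. 5); whether a raw depth sample equals the axis distance or must be converted to a ray distance depends on the engine's depth convention, and the positions here are hypothetical:

    import numpy as np

    camera_pos = np.array([0.0, 0.0, 0.0])      # camera at the origin (assumed)
    first_pos  = np.array([30.0, 10.0, 400.0])  # first position; Z is its depth

    # Taking depth as the Z value, the fourth distance can be read off the
    # depth directly, or computed as the full Euclidean distance when the
    # off-axis offset matters; the fifth distance follows identically from
    # the second position's depth.
    fourth_distance_axis  = first_pos[2] - camera_pos[2]          # 400.0
    fourth_distance_exact = float(np.linalg.norm(first_pos - camera_pos))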
From the description of the above embodiments, it will be clear to a person skilled in the art that the method according to the above embodiments may be implemented by means of software plus the necessary general hardware platform, but of course also by means of hardware, but in many cases the former is a preferred embodiment. Based on such understanding, the technical solution of the present application may be embodied essentially or in a part contributing to the prior art in the form of a software product stored in a storage medium (such as ROM/RAM, magnetic disk, optical disk), comprising several instructions for causing a terminal device (which may be a mobile phone, a computer, a server, or a network device, etc.) to perform the method of the embodiments of the present application.
In this embodiment, a mapping processing apparatus is further provided, and this apparatus is used to implement the foregoing embodiments and preferred embodiments, and will not be described in detail. As used below, the term "module" may be a combination of software and/or hardware that implements a predetermined function. While the means described in the following embodiments are preferably implemented in software, implementation in hardware, or a combination of software and hardware, is also possible and contemplated.
Fig. 6 is a block diagram of a mapping processing apparatus according to an embodiment of the present application. As shown in fig. 6, taking the mapping processing apparatus 600 as an example, the mapping processing apparatus 600 includes: a first determining module 601, configured to determine a distortion value based on a preset map, where the preset map is used to simulate a distortion manner of refraction processing and the distortion value is used to represent the distortion intensity of any pixel of the preset map in screen space; an offset module 602, configured to perform offset processing on first texture coordinates of an initial map based on the distortion value to obtain a distortion map, where the initial map is a map to be mapped onto a target object; a second determining module 603, configured to determine target texture coordinates based on the first texture coordinates, second texture coordinates of the distortion map, and distance information, where the distance information includes a first distance and a second distance, the first distance being the distance between a first position of a pixel point of the initial map on the target object and a preset refraction plane, and the second distance being the distance between a second position of the pixel point of the distortion map on the target object and the preset refraction plane; and a sampling module 604, configured to perform texture sampling on the initial map based on the target texture coordinates to obtain a target map, where the target map is used to simulate a refraction effect obtained after refraction processing is performed on the target object, and the distortion strength of the target map changes based on a third distance between the target object and the preset refraction plane.
Optionally, the second determining module 603 is further configured to: determining a target weight value based on the first distance and the second distance, wherein the target weight value is used for representing weight values corresponding to different distances between a target object and a preset refraction plane; and carrying out interpolation processing on the first texture coordinate, the second texture coordinate and the target weight value to obtain the target texture coordinate.
Optionally, the second determining module 603 is further configured to: mapping the first distance to a preset distance range to obtain a first weight value, wherein the first weight value is used for representing weight values corresponding to different distances between the first position and a preset refraction plane; mapping the second distance to a preset distance range to obtain a second weight value, wherein the second weight value is used for representing weight values corresponding to different distances between the second position and a preset refraction plane; and carrying out fusion processing on the first weight value and the second weight value to obtain a target weight value.
Optionally, the second determining module 603 is further configured to: acquiring a third distance between a preset refraction plane and the camera, a fourth distance between the first position and the camera and a fifth distance between the second position and the camera; subtracting the third distance from the fourth distance to obtain a first distance; and subtracting the third distance from the fifth distance to obtain a second distance.
Optionally, the second determining module 603 is further configured to: subtracting the position of the camera from the position of a preset refraction plane to obtain a three-dimensional distance; a third distance is determined based on the three components of the three-dimensional distance.
Optionally, the second determining module 603 is further configured to: determining a fourth distance based on the depth information of the first location; a fifth distance is determined based on the depth information of the second location.
Optionally, the first determining module 601 is further configured to: obtain third texture coordinates of the preset map; scale the third texture coordinates based on a preset scaling ratio to obtain a scaled map; and determine the distortion value based on the first channel value of the scaled map, the second channel value of the scaled map, and a preset distortion ratio.
Optionally, the offset module 602 is further configured to: and adding the distortion value and the corresponding first texture coordinate to obtain the distortion map.
It should be noted that each of the above modules may be implemented by software or hardware, and for the latter, it may be implemented by, but not limited to: the modules are all located in the same processor; alternatively, the above modules may be located in different processors in any combination.
Embodiments of the present application also provide a computer readable storage medium having a computer program stored therein, wherein the computer program is arranged to perform the steps of any of the method embodiments described above when run.
Alternatively, in the present embodiment, the above-described computer-readable storage medium may include, but is not limited to: a USB flash disk, a read-only memory (ROM), a random access memory (RAM), a removable hard disk, a magnetic disk, an optical disk, or other media capable of storing a computer program.
Alternatively, in this embodiment, the above-mentioned computer-readable storage medium may be located in any one of the computer terminals in the computer terminal group in the computer network, or in any one of the mobile terminals in the mobile terminal group.
Alternatively, in the present embodiment, the above-described computer-readable storage medium may be configured to store a computer program for performing the steps of:
step S30, determining a distortion value based on a preset map, wherein the preset map is used for simulating a distortion mode of refraction processing, and the distortion value is used for representing the distortion intensity of any pixel of the preset map in a screen space;
step S32, performing offset processing on a first texture coordinate of an initial mapping based on the distortion value to obtain a distortion mapping, wherein the initial mapping is a mapping to be mapped onto a target object;
step S34, determining a target texture coordinate based on the first texture coordinate, a second texture coordinate of the distortion map and distance information, wherein the distance information comprises a first distance and a second distance, the first distance is a distance between a first position of a pixel point of the initial map on the target object and a preset refraction plane, and the second distance is a distance between a second position of the pixel point of the distortion map on the target object and the preset refraction plane;
Step S36, performing texture sampling on the initial mapping based on the target texture coordinates to obtain a target mapping, wherein the target mapping is used for simulating a refraction effect obtained by carrying out refraction processing on the target object, and the distortion strength of the target mapping is changed based on a third distance between the target object and a preset refraction plane.
Optionally, the above computer readable storage medium is further configured to store program code for performing the steps of: determining a target weight value based on the first distance and the second distance, wherein the target weight value is used for representing weight values corresponding to different distances between a target object and a preset refraction plane; and carrying out interpolation processing on the first texture coordinate, the second texture coordinate and the target weight value to obtain the target texture coordinate.
Optionally, the above computer readable storage medium is further configured to store program code for performing the steps of: mapping the first distance to a preset distance range to obtain a first weight value, wherein the first weight value is used for representing weight values corresponding to different distances between the first position and a preset refraction plane; mapping the second distance to a preset distance range to obtain a second weight value, wherein the second weight value is used for representing weight values corresponding to different distances between the second position and a preset refraction plane; and carrying out fusion processing on the first weight value and the second weight value to obtain a target weight value.
Optionally, the above computer readable storage medium is further configured to store program code for performing the steps of: acquiring a third distance between a preset refraction plane and the camera, a fourth distance between the first position and the camera and a fifth distance between the second position and the camera; subtracting the third distance from the fourth distance to obtain a first distance; and subtracting the third distance from the fifth distance to obtain a second distance.
Optionally, the above computer readable storage medium is further configured to store program code for performing the steps of: subtracting the position of the camera from the position of a preset refraction plane to obtain a three-dimensional distance; a third distance is determined based on the three components of the three-dimensional distance.
Optionally, the above computer readable storage medium is further configured to store program code for performing the steps of: determining a fourth distance based on the depth information of the first location; a fifth distance is determined based on the depth information of the second location.
Optionally, the above computer readable storage medium is further configured to store program code for performing the following steps: obtaining third texture coordinates of the preset map; scaling the third texture coordinates based on a preset scaling ratio to obtain a scaled map; and determining the distortion value based on the first channel value of the scaled map, the second channel value of the scaled map, and a preset distortion ratio.
Optionally, the above computer readable storage medium is further configured to store program code for performing the steps of: and adding the distortion value and the corresponding first texture coordinate to obtain the distortion map.
In the computer readable storage medium of this embodiment, a technical solution for mapping processing is provided: a distortion value is determined based on a preset map, where the preset map is used to simulate a distortion manner of refraction processing and the distortion value is used to represent the distortion intensity of any pixel of the preset map in screen space; offset processing is performed on first texture coordinates of an initial map based on the distortion value to obtain a distortion map, where the initial map is the map to be mapped onto the target object; target texture coordinates are determined based on the first texture coordinates, second texture coordinates of the distortion map, and distance information, where the distance information includes a first distance and a second distance, the first distance being the distance between a first position of a pixel point of the initial map on the target object and a preset refraction plane, and the second distance being the distance between a second position of the pixel point of the distortion map on the target object and the preset refraction plane; and texture sampling is performed on the initial map based on the target texture coordinates to obtain a target map, where the target map is used to simulate the refraction effect obtained after refraction processing is performed on the target object, and the distortion strength of the target map changes based on a third distance between the target object and the preset refraction plane. In this way, the refraction intensity changes with the distance between an object and the refraction plane, with greater distance producing stronger refraction, so a semitransparent refraction effect is correctly simulated at relatively low cost and the realism and detail of rendering are increased, which solves the technical problem in the related art that the semitransparent refraction effect is incorrect when technologies such as screen-space refraction and ray tracing are used to simulate refraction.
From the above description of the embodiments, those skilled in the art will readily appreciate that the example embodiments described herein may be implemented in software, or in software in combination with the necessary hardware. Thus, the technical solution according to the embodiments of the present application may be embodied in the form of a software product, which may be stored in a computer readable storage medium (such as a CD-ROM, a USB flash drive, or a removable hard disk) or on a network, and which includes several instructions to cause a computing device (such as a personal computer, a server, a terminal device, or a network device) to perform the method according to the embodiments of the present application.
In an exemplary embodiment of the present application, a computer-readable storage medium has stored thereon a program product capable of implementing the method described above in this embodiment. In some possible implementations, the various aspects of the embodiments of the present application may also be implemented in the form of a program product comprising program code; when the program product is run on a terminal device, the program code causes the terminal device to carry out the steps according to the various exemplary embodiments of the present application as described in the "exemplary methods" section of the embodiments.
A program product for implementing the above method according to an embodiment of the present application may employ a portable compact disc read-only memory (CD-ROM), include program code, and be run on a terminal device such as a personal computer. However, the program product of the embodiments of the present application is not limited thereto; in the embodiments of the present application, the computer-readable storage medium may be any tangible medium that can contain or store the program for use by or in connection with the instruction execution system, apparatus, or device.
Any combination of one or more computer readable media may be employed by the program product described above. The computer readable storage medium can be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples (a non-exhaustive list) of the computer-readable storage medium include: an electrical connection having one or more wires, a portable disk, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM) or flash memory, an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
It should be noted that the program code embodied on the computer readable storage medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, radio frequency (RF), etc., or any suitable combination of the foregoing.
Embodiments of the present application also provide an electronic device comprising a memory having a computer program stored therein and a processor arranged to run the computer program to perform the steps of any of the method embodiments described above.
Optionally, the electronic device may further include a transmission device and an input/output device, where the transmission device is connected to the processor, and the input/output device is connected to the processor.
Optionally, in this embodiment, the above processor may be configured to perform the following steps by a computer program:
Step S30, determining a distortion value based on a preset map, wherein the preset map is used for simulating a distortion mode of refraction processing, and the distortion value is used for representing the distortion intensity of any pixel of the preset map in screen space;
Step S32, performing offset processing on a first texture coordinate of an initial map based on the distortion value to obtain a distortion map, wherein the initial map is a map to be mapped onto a target object;
Step S34, determining a target texture coordinate based on the first texture coordinate, a second texture coordinate of the distortion map and distance information, wherein the distance information comprises a first distance and a second distance, the first distance is a distance between a first position of a pixel point of the initial map on the target object and a preset refraction plane, and the second distance is a distance between a second position of the pixel point of the distortion map on the target object and the preset refraction plane;
Step S36, performing texture sampling on the initial map based on the target texture coordinates to obtain a target map, wherein the target map is used for simulating a refraction effect obtained after refraction processing is performed on the target object, and the distortion intensity of the target map is changed based on a third distance between the target object and the preset refraction plane.
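Read end to end, steps S30 to S36 amount to a single screen-space pass. The Python/NumPy sketch below strings them together as a CPU-side illustration under one plausible reading (in practice this would be a fragment shader); the channel remapping, the default parameter values, and the simple clamped distance weight are assumptions, with the weight computation refined in the optional steps that follow.

```python
import numpy as np

def fetch(tex, uv):
    """Nearest-neighbour fetch from an H x W x C texture at normalized,
    wrapped UV coordinates (uv has shape H x W x 2)."""
    h, w = tex.shape[:2]
    x = np.clip(((uv[..., 0] % 1.0) * (w - 1)).astype(int), 0, w - 1)
    y = np.clip(((uv[..., 1] % 1.0) * (h - 1)).astype(int), 0, h - 1)
    return tex[y, x]

def refraction_pass(initial_map, preset_map, depth_map, uv, plane_depth,
                    scaling_ratio=4.0, distortion_ratio=0.05, d_max=2.0):
    """One pass over steps S30-S36. depth_map (H x W x 1) holds each pixel's
    camera distance; plane_depth is the plane's camera distance (third distance)."""
    # Step S30: distortion values from the first two channels of the scaled preset map
    noise = fetch(preset_map, uv * scaling_ratio)
    distortion = (noise[..., :2] * 2.0 - 1.0) * distortion_ratio
    # Step S32: offset the first texture coordinates -> distortion map's coordinates
    uv2 = uv + distortion
    # Distance information: first/second distance of each position to the plane
    d1 = fetch(depth_map, uv)[..., 0] - plane_depth
    d2 = fetch(depth_map, uv2)[..., 0] - plane_depth
    # Step S34: distance-driven weight (simple clamp here; see the weight
    # refinement in the optional steps below), then interpolate the coordinates
    w = np.clip(d1 / d_max, 0.0, 1.0) * np.clip(d2 / d_max, 0.0, 1.0)
    target = uv + (uv2 - uv) * w[..., None]
    # Step S36: texture-sample the initial map at the target texture coordinates
    return fetch(initial_map, target)
```

On a GPU the same logic maps one-to-one onto texture samples and a lerp, so the pass stays far cheaper than screen-space refraction or ray tracing.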
Optionally, the above processor may be further configured to perform the following steps by a computer program: determining a target weight value based on the first distance and the second distance, wherein the target weight value is used for representing the weight values corresponding to different distances between the target object and the preset refraction plane; and performing interpolation processing on the first texture coordinate, the second texture coordinate and the target weight value to obtain the target texture coordinate.
Optionally, the above processor may be further configured to perform the following steps by a computer program: mapping the first distance to a preset distance range to obtain a first weight value, wherein the first weight value is used for representing the weight values corresponding to different distances between the first position and the preset refraction plane; mapping the second distance to the preset distance range to obtain a second weight value, wherein the second weight value is used for representing the weight values corresponding to different distances between the second position and the preset refraction plane; and performing fusion processing on the first weight value and the second weight value to obtain the target weight value.
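As a worked example of this weighting scheme, the per-pixel Python sketch below remaps both distances into a preset distance range, fuses the two weight values, and interpolates the two texture coordinates. The range endpoints and the multiplicative fusion are assumptions, since the description leaves the exact fusion operation open; min() or averaging would fit it equally well.

```python
import numpy as np

def remap_to_range(d, d_min=0.0, d_max=2.0):
    """Map a plane distance into the preset distance range [d_min, d_max],
    yielding a weight value in [0, 1]; the endpoints are illustrative."""
    return float(np.clip((d - d_min) / (d_max - d_min), 0.0, 1.0))

def target_texture_coordinate(uv1, uv2, d1, d2):
    """Fuse the first and second weight values into the target weight value,
    then interpolate the first and second texture coordinates with it."""
    w1 = remap_to_range(d1)              # first weight value
    w2 = remap_to_range(d2)              # second weight value
    w = w1 * w2                          # target weight value (assumed fusion)
    uv1 = np.asarray(uv1, dtype=float)
    uv2 = np.asarray(uv2, dtype=float)
    return (1.0 - w) * uv1 + w * uv2     # lerp(uv1, uv2, w)
```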
Optionally, the above processor may be further configured to perform the following steps by a computer program: acquiring a third distance between a preset refraction plane and the camera, a fourth distance between the first position and the camera and a fifth distance between the second position and the camera; subtracting the third distance from the fourth distance to obtain a first distance; and subtracting the third distance from the fifth distance to obtain a second distance.
Optionally, the above processor may be further configured to perform the following steps by a computer program: subtracting the position of the camera from the position of a preset refraction plane to obtain a three-dimensional distance; and determining a third distance based on the three components of the three-dimensional distance.
Optionally, the above processor may be further configured to perform the following steps by a computer program: determining a fourth distance based on the depth information of the first position; and determining a fifth distance based on the depth information of the second position.
Optionally, the above processor may be further configured to perform the following steps by a computer program: obtaining a third texture coordinate of the preset map; scaling the third texture coordinate based on a preset scaling ratio to obtain a scaled map; and determining a distortion value based on the first channel value of the scaled map, the second channel value of the scaled map, and a preset distortion ratio.
Optionally, the above processor may be further configured to perform the following steps by a computer program: adding the distortion value to the corresponding first texture coordinate to obtain the distortion map.
In the electronic device of this embodiment, a technical solution for mapping processing is provided: a distortion value is determined based on a preset map, wherein the preset map is used for simulating a distortion mode of refraction processing, and the distortion value is used for representing the distortion intensity of any pixel of the preset map in screen space; offset processing is performed on a first texture coordinate of an initial map based on the distortion value to obtain a distortion map, wherein the initial map is the map to be mapped onto a target object; a target texture coordinate is determined based on the first texture coordinate, a second texture coordinate of the distortion map and distance information, wherein the distance information comprises a first distance and a second distance, the first distance is a distance between a first position of a pixel point of the initial map on the target object and a preset refraction plane, and the second distance is a distance between a second position of the pixel point of the distortion map on the target object and the preset refraction plane; and texture sampling is performed on the initial map based on the target texture coordinates to obtain a target map, wherein the target map is used for simulating a refraction effect obtained after refraction processing is performed on the target object, and the distortion intensity of the target map is changed based on a third distance between the target object and the preset refraction plane. This achieves the purpose of varying the refraction intensity with the distance between an object and the refraction plane, with greater distances producing stronger refraction, thereby achieving the technical effects of correctly simulating the semitransparent refraction effect at low cost and increasing the realism and detail of the rendering, and solving the technical problem in the related art that simulating refraction with techniques such as screen-space refraction and ray tracing yields an incorrect semitransparent refraction effect.
Fig. 7 is a schematic diagram of an electronic device according to an embodiment of the present application. As shown in fig. 7, the electronic device 700 is only an example, and should not impose any limitation on the functionality and scope of use of the embodiments of the present application.
As shown in fig. 7, the electronic device 700 is embodied in the form of a general-purpose computing device. The components of the electronic device 700 may include, but are not limited to: at least one processor 710, at least one memory 720, a bus 730 connecting the various system components (including the memory 720 and the processor 710), and a display 740.
The memory 720 stores program code that can be executed by the processor 710, so as to cause the processor 710 to perform the steps according to the various exemplary implementations of the present application described in the method sections of the embodiments above.
The memory 720 may include readable media in the form of volatile memory units, such as a random access memory (RAM) 7201 and/or a cache memory 7202, and may further include a read-only memory (ROM) 7203; it may also include nonvolatile memory, such as one or more magnetic storage devices, flash memory, or other nonvolatile solid state memory.
In some examples, memory 720 may also include a program/utility 7204 having a set (at least one) of program modules 7205, such program modules 7205 including, but not limited to: an operating system, one or more application programs, other program modules, and program data, each or some combination of which may include an implementation of a network environment. Memory 720 may further include memory located remotely from processor 710, which may be connected to electronic device 700 via a network. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
The bus 730 may be a bus representing one or more of several types of bus structures, including a memory unit bus or memory unit controller, a peripheral bus, an accelerated graphics port, the processor 710, or a local bus using any of a variety of bus architectures.
The display 740 may be, for example, a touch screen type liquid crystal display (Liquid Crystal Display, LCD) that may enable a user to interact with a user interface of the electronic device 700.
Optionally, the electronic device 700 may also communicate with one or more external devices 800 (e.g., a keyboard, a pointing device, a Bluetooth device, etc.), with one or more devices that enable a user to interact with the electronic device 700, and/or with any device (e.g., a router, a modem, etc.) that enables the electronic device 700 to communicate with one or more other computing devices. Such communication may occur through an Input/Output (I/O) interface 750. Also, the electronic device 700 may communicate with one or more networks (e.g., a local area network (Local Area Network, LAN), a wide area network (Wide Area Network, WAN) and/or a public network, such as the Internet) via the network adapter 760. As shown in fig. 7, the network adapter 760 communicates with the other modules of the electronic device 700 over the bus 730. It should be appreciated that although not shown in fig. 7, other hardware and/or software modules may be used in connection with the electronic device 700, including but not limited to: microcode, device drivers, redundant processing units, external disk drive arrays, redundant array of independent disks (Redundant Array of Independent Disks, RAID) systems, tape drives, data backup storage systems, and the like.
The electronic device 700 may further include: a keyboard, a cursor control device (e.g., a mouse), an input/output interface (I/O interface), a network interface, a power supply, and/or a camera.
It will be appreciated by those of ordinary skill in the art that the configuration shown in fig. 7 is merely illustrative and is not intended to limit the configuration of the electronic device described above. For example, the electronic device 700 may also include more or fewer components than shown in fig. 7, or have a different configuration from that shown in fig. 7. The memory 720 may be used to store a computer program and corresponding data, such as the computer program and corresponding data for the mapping processing method in the embodiments of the present application. The processor 710 executes the computer program stored in the memory 720 to perform various functional applications and data processing, i.e., to implement the mapping processing method described above.
The foregoing embodiment numbers of the present application are merely for description and do not represent the superiority or inferiority of the embodiments.
In the foregoing embodiments of the present application, the description of each embodiment has its own emphasis; for any portion not described in detail in one embodiment, reference may be made to the related descriptions of other embodiments.
In the several embodiments provided in the present application, it should be understood that the disclosed technical content may be implemented in other manners. The above-described apparatus embodiments are merely exemplary; for example, the division of the units may be a logical function division, and other divisions are possible in actual implementation: a plurality of units or components may be combined or integrated into another system, or some features may be omitted or not performed. Alternatively, the coupling, direct coupling, or communication connection shown or discussed between components may be through some interfaces, units, or modules, and may be in electrical or other forms.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in each embodiment of the present application may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated units may be implemented in hardware or in software functional units.
The integrated units, if implemented in the form of software functional units and sold or used as stand-alone products, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present application, in essence, or the part contributing to the prior art, or all or part of the technical solution, may be embodied in the form of a software product stored in a storage medium, including several instructions to cause a computer device (which may be a personal computer, a server, a network device, etc.) to perform all or part of the steps of the methods described in the embodiments of the present application. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a read-only memory (ROM), a random access memory (RAM), a removable hard disk, a magnetic disk, or an optical disk.
The foregoing is merely a preferred embodiment of the present application. It should be noted that those skilled in the art may make several improvements and modifications without departing from the principles of the present application, and such improvements and modifications shall also fall within the protection scope of the present application.

Claims (11)

1. A mapping processing method, the method comprising:
determining a distortion value based on a preset map, wherein the preset map is used for simulating a distortion mode of refraction processing, and the distortion value is used for representing the distortion intensity of any pixel of the preset map in screen space;
performing offset processing on a first texture coordinate of an initial map based on the distortion value to obtain a distortion map, wherein the initial map is a map to be mapped onto a target object;
determining a target texture coordinate based on the first texture coordinate, a second texture coordinate of the distortion map and distance information, wherein the distance information comprises a first distance and a second distance, the first distance is a distance between a first position of a pixel point of the initial map on the target object and a preset refraction plane, and the second distance is a distance between a second position of the pixel point of the distortion map on the target object and the preset refraction plane;
and performing texture sampling on the initial map based on the target texture coordinates to obtain a target map, wherein the target map is used for simulating a refraction effect obtained after the refraction processing is performed on the target object, and the distortion intensity of the target map is changed based on a third distance between the target object and the preset refraction plane.
2. The method of claim 1, wherein the determining a target texture coordinate based on the first texture coordinate, the second texture coordinate of the distortion map, and distance information comprises:
determining a target weight value based on the first distance and the second distance, wherein the target weight value is used for representing weight values corresponding to different distances between the target object and the preset refraction plane;
and performing interpolation processing on the first texture coordinate, the second texture coordinate and the target weight value to obtain the target texture coordinate.
3. The method of claim 2, wherein the determining a target weight value based on the first distance and the second distance comprises:
mapping the first distance to a preset distance range to obtain a first weight value, wherein the first weight value is used for representing weight values corresponding to different distances between the first position and the preset refraction plane;
mapping the second distance to the preset distance range to obtain a second weight value, wherein the second weight value is used for representing weight values corresponding to different distances between the second position and the preset refraction plane;
and performing fusion processing on the first weight value and the second weight value to obtain the target weight value.
4. The method of claim 1, wherein determining the distance information comprises:
acquiring a third distance between the preset refraction plane and a camera, a fourth distance between the first position and the camera, and a fifth distance between the second position and the camera;
subtracting the third distance from the fourth distance to obtain the first distance;
and subtracting the third distance from the fifth distance to obtain the second distance.
5. The method of claim 4, wherein obtaining the third distance comprises:
subtracting the position of the camera from the position of the preset refraction plane to obtain a three-dimensional distance;
and determining the third distance based on the three components of the three-dimensional distance.
6. The method of claim 4, wherein obtaining the fourth distance and the fifth distance comprises:
determining the fourth distance based on depth information of the first position;
and determining the fifth distance based on depth information of the second position.
7. The method of claim 1, wherein determining a distortion value based on a preset map comprises:
obtaining a third texture coordinate of the preset map;
scaling the third texture coordinate based on a preset scaling ratio to obtain a scaled map;
and determining the distortion value based on the first channel value of the scaled map, the second channel value of the scaled map, and a preset distortion ratio.
8. The method of claim 1, wherein the performing offset processing on the first texture coordinate of the initial map based on the distortion value comprises:
adding the distortion value to the corresponding first texture coordinate to obtain the distortion map.
9. A mapping processing apparatus, characterized in that the apparatus comprises:
a first determining module, used for determining a distortion value based on a preset map, wherein the preset map is used for simulating a distortion mode of refraction processing, and the distortion value is used for representing the distortion intensity of any pixel of the preset map in screen space;
an offset module, used for performing offset processing on a first texture coordinate of an initial map based on the distortion value to obtain a distortion map, wherein the initial map is a map to be mapped onto a target object;
a second determining module, used for determining a target texture coordinate based on the first texture coordinate, a second texture coordinate of the distortion map, and distance information, wherein the distance information comprises a first distance and a second distance, the first distance is a distance between a first position of a pixel point of the initial map on the target object and a preset refraction plane, and the second distance is a distance between a second position of the pixel point of the distortion map on the target object and the preset refraction plane;
and a sampling module, used for performing texture sampling on the initial map based on the target texture coordinates to obtain a target map, wherein the target map is used for simulating a refraction effect obtained after refraction processing is performed on the target object, and the distortion intensity of the target map is changed based on a third distance between the target object and the preset refraction plane.
10. A computer readable storage medium, characterized in that the computer readable storage medium has stored therein a computer program, wherein the computer program is arranged to perform the mapping processing method as claimed in any one of claims 1 to 8 when run on a computer or processor.
11. An electronic device comprising a memory and a processor, characterized in that the memory has stored therein a computer program, and the processor is arranged to run the computer program to perform the mapping processing method as claimed in any one of claims 1 to 8.
CN202311390758.6A 2023-10-23 2023-10-23 Mapping processing method and device, computer readable storage medium and electronic device Pending CN117496033A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311390758.6A CN117496033A (en) 2023-10-23 2023-10-23 Mapping processing method and device, computer readable storage medium and electronic device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202311390758.6A CN117496033A (en) 2023-10-23 2023-10-23 Mapping processing method and device, computer readable storage medium and electronic device

Publications (1)

Publication Number Publication Date
CN117496033A (en) 2024-02-02

Family

ID=89668111

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311390758.6A Pending CN117496033A (en) 2023-10-23 2023-10-23 Mapping processing method and device, computer readable storage medium and electronic device

Country Status (1)

Country Link
CN (1) CN117496033A (en)

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination