CN114627231A - Method and device for determining transparency, electronic equipment and storage medium

Method and device for determining transparency, electronic equipment and storage medium

Info

Publication number
CN114627231A
CN114627231A
Authority
CN
China
Prior art keywords
target detection
detection point
determining
model
sub
Prior art date
Legal status
Pending
Application number
CN202011444046.4A
Other languages
Chinese (zh)
Inventor
冯乐乐 (Feng Lele)
Current Assignee
Shanghai Mihoyo Tianming Technology Co Ltd
Original Assignee
Shanghai Mihoyo Tianming Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Shanghai Mihoyo Tianming Technology Co Ltd filed Critical Shanghai Mihoyo Tianming Technology Co Ltd
Priority to CN202011444046.4A
Priority to PCT/CN2021/131497
Publication of CN114627231A

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/00 - 3D [Three Dimensional] image rendering
    • G06T15/50 - Lighting effects
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T13/00 - Animation
    • G06T13/20 - 3D [Three Dimensional] animation

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Graphics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention discloses a method and device for determining transparency, an electronic device and a storage medium. The method includes: respectively determining a distance function of the spherical distribution of each target detection point on a first sub-model; processing each distance function based on a preset spherical harmonic function to obtain the projection coefficient values of each target detection point; and, for each target detection point, storing the projection coefficient values into the vertex color and/or attribute information corresponding to the current target detection point, importing them into a target position in an engine according to the index coordinates of the vertex, determining the target distance information between each target detection point and the second sub-model at a target shooting angle from data reconstructed from the vertex color and/or attribute information stored at the target position, and determining a transparency parameter based on the target distance information so as to display the first sub-model and the second sub-model. The technical solution of this embodiment makes the transparent display effect match reality, thereby improving the user experience.

Description

Method and device for determining transparency, electronic equipment and storage medium
Technical Field
The embodiment of the invention relates to a game development technology, in particular to a method and a device for determining transparency, electronic equipment and a storage medium.
Background
In animation, a semi-transparent effect is often set between an inner model and an outer model, for example a semi-transparent display of a skin model under a clothes model. The semi-transparent display depends mainly on light that is reflected by the inner object, travels a certain distance through the outer object, and then enters the human eye. Each model is composed of points: the larger the distance between a point on the inner object and the corresponding point on the outer object, the weaker the semi-transparent effect; the smaller the distance, the stronger it is.
Currently, the semi-transparent effect is usually produced by setting a transparency display value for the outer model. This value is usually fixed, and every point on the outer model is displayed transparently according to it. As a result, the transparent display deviates from the actual situation, so the display effect is poor and the user experience suffers.
Disclosure of Invention
The invention provides a method and a device for determining transparency, an electronic device and a storage medium, so that the transparent display effect matches reality, thereby improving the user experience.
In a first aspect, an embodiment of the present invention provides a method for determining transparency, where the method includes:
respectively determining a distance function of spherical distribution of each target detection point on the first submodel, wherein the distance function is determined according to distance information between the target detection point and the second submodel in each direction; the first sub-model is a model wrapping the second sub-model;
processing each distance function based on a preset spherical harmonic function to obtain a projection coefficient value of each target detection point; the spherical harmonic function is composed of a plurality of basic functions;
and for each target detection point: storing the projection coefficient values corresponding to the current target detection point into the vertex color corresponding to the current target detection point and/or the attribute information of the current target detection point; importing them into a target position in an engine according to the index coordinates of the vertex; determining the target distance information between each target detection point and the second sub-model at a target shooting angle from data reconstructed from the vertex color and/or attribute information stored at the target position; determining the transparency parameter of each target detection point based on the target distance information; and displaying the first sub-model and the second sub-model based on each transparency parameter.
In a second aspect, an embodiment of the present invention further provides a method for determining transparency, where the method includes:
determining a target shooting angle corresponding to each target detection point on the shooting device and the first submodel;
for each target detection point, a projection coefficient value corresponding to the current target detection point is called from the vertex color and/or attribute information corresponding to the current target detection point, and distance information between a first sub-model and a second sub-model corresponding to the current target detection point is determined according to the projection coefficient value and a target shooting angle; the first sub-model is a model wrapping the second sub-model; the projection coefficient value stored in the vertex color and/or attribute information is determined after the distance function of the spherical distribution of each target detection point on the first sub-model is processed on the basis of the spherical harmonic function;
and determining the transparency parameter of each target detection point according to the distance information corresponding to each target detection point, and displaying the first submodel and the second submodel based on the transparency parameter.
In a third aspect, an embodiment of the present invention further provides an apparatus for determining transparency, where the apparatus includes:
the distance information determining module is used for respectively determining a distance function of the spherical distribution of each target detection point on the first submodel, and the distance function is determined according to the distance information between the target detection points and the second submodel in each direction; the first sub-model is a model wrapping the second sub-model;
the projection coefficient value determining module is used for processing each distance function based on a preset spherical harmonic function to obtain the projection coefficient value of each target detection point on each basis function; the spherical harmonic function is composed of a plurality of basic functions;
and the projection coefficient value storage module is used for, for each target detection point: storing the projection coefficient values corresponding to the current target detection point into the vertex color corresponding to the current target detection point and/or the attribute information of the current target detection point; importing them into a target position in an engine according to the index coordinates of the vertex; determining the target distance information between each target detection point and the second sub-model at a target shooting angle from data reconstructed from the vertex color and/or attribute information stored at the target position; determining the transparency parameter of each target detection point based on the target distance information; and displaying the first sub-model and the second sub-model based on each transparency parameter.
In a fourth aspect, an embodiment of the present invention further provides an apparatus for determining transparency, where the apparatus includes:
the target shooting angle determining module is used for determining a target shooting angle corresponding to each target detection point on the shooting device and the first sub-model;
the distance information determining module is used for calling a projection coefficient value corresponding to the current target detection point from the vertex color and/or attribute information of the current target detection point aiming at each target detection point, and determining the distance information between a first sub-model and a second sub-model corresponding to the current target detection point according to the projection coefficient value and a target shooting angle; the first sub-model is a model wrapping the second sub-model; the projection coefficient value stored in the vertex color and/or attribute information is determined after the distance function of the spherical distribution of each target detection point on the first sub-model is processed on the basis of the spherical harmonic function;
and the transparent display module is used for determining the transparency parameter of each target detection point according to the distance information corresponding to each target detection point and displaying the first submodel and the second submodel based on the transparency parameter.
In a fifth aspect, an embodiment of the present invention further provides an electronic device, where the electronic device includes:
one or more processors;
a storage device for storing one or more programs,
when the one or more programs are executed by the one or more processors, the one or more processors implement a method for determining transparency according to any one of the embodiments of the present invention.
In a sixth aspect, embodiments of the present invention further provide a storage medium containing computer-executable instructions, which when executed by a computer processor, are used to perform the method for determining transparency according to any one of the embodiments of the present invention.
According to the technical solution of the embodiments of the invention, the distance function of the spherical distribution of each target detection point on the first sub-model is determined, the distance function is processed based on a preset spherical harmonic function to determine the projection coefficients, and these are stored in the vertex color and/or attribute information, so that the transparency parameter at the target shooting angle can be determined from the stored information and the reconstruction function and then displayed. This solves the technical problem that transparently displaying the first sub-model with a fixed transparency display value deviates from the actual situation, giving a poor display effect and a poor user experience; the transparency parameter is instead adjusted to the actual situation, the transparent display matches the theoretically expected effect, and the user experience is improved.
Drawings
In order to more clearly illustrate the technical solutions of the exemplary embodiments of the present invention, a brief description of the drawings used in describing the embodiments is given below. It should be clear that the drawings described below illustrate only some of the embodiments of the invention, not all of them; a person skilled in the art can derive other drawings from them without inventive effort.
Fig. 1 is a schematic flow chart illustrating a method for determining transparency according to an embodiment of the present invention;
fig. 2 is a flowchart illustrating a method for determining transparency according to a second embodiment of the present invention;
fig. 3 is a flowchart illustrating a method for determining transparency according to a third embodiment of the present invention;
fig. 4 is a flowchart illustrating a method for determining transparency according to a fourth embodiment of the present invention;
fig. 5 is a schematic structural diagram of an apparatus for determining transparency according to a fifth embodiment of the present invention;
fig. 6 is a schematic structural diagram of an apparatus for determining transparency according to a sixth embodiment of the present invention;
fig. 7 is a schematic structural diagram of an electronic device according to a seventh embodiment of the present invention.
Detailed Description
The present invention will be described in further detail with reference to the accompanying drawings and examples. It is to be understood that the specific embodiments described herein are merely illustrative of the invention and are not limiting of the invention. It should be further noted that, for the convenience of description, only some of the structures related to the present invention are shown in the drawings, not all of the structures.
Example one
Fig. 1 is a flowchart of a method for determining transparency according to an embodiment of the present invention, where the embodiment is applicable to a situation where transparency parameters are adjusted according to an actual situation so that a transparent display effect matches an actual theoretical effect, and the method may be executed by a device for determining transparency, where the device may be implemented in a form of software and/or hardware, where the hardware may be an electronic device, and optionally, the electronic device may be a mobile terminal, and the like.
As shown in fig. 1, this embodiment specifically includes the following steps:
and S110, respectively determining the distance function of the spherical distribution of each target detection point on the first submodel.
The first sub-model and the second sub-model are defined relative to each other: if the application scene is a skin model and a clothes model, the model corresponding to the clothes can be used as the first sub-model and the model corresponding to the skin as the second sub-model. A target detection point can be a detection point preset on the first sub-model; or the first sub-model can be divided in advance into a plurality of blocks, with the central point of each block used as a target detection point; or it can be a detection point set by the developer according to actual requirements; of course, since the first sub-model is composed of a plurality of points, every point may also be set as a target detection point. Each target detection point corresponds to one distance function, i.e. one target detection point corresponds to one distance function. Each distance function can be understood as being determined from the relative distance information between a certain target detection point and the second sub-model in all directions in space. Because the distance function is spherically distributed, the distance between a target detection point and the second sub-model in each direction can be obtained by taking the target detection point as the sphere center, emitting physical rays in every direction in space, and determining the distance between the target detection point and the intersection point of each physical ray with the second sub-model. For example, if there are 1000 target detection points, there are 1000 distance functions, each determined from the distances between one target detection point and the second sub-model in all directions.
It should be noted that the spherically distributed distance function is determined in the same manner for every target detection point; for clarity of describing the technical solution of this embodiment, the determination for a single target detection point is described as an example.
Specifically, a target detection point on the first sub-model may be used as a sphere center, physical rays may be emitted in each direction in space, and distance information between the target detection point and an intersection point of each physical ray and the second sub-model may be determined. And if the physical ray has an intersection point with the second sub-model in the direction of the ray, taking the distance between the intersection point and the target detection point as the distance information between the target detection point and the second sub-model in the direction. If there is no intersection point between the physical ray and the second sub-model in the direction of the ray, the preset maximum distance information may be used as the distance information between the target detection point and the second sub-model in the direction.
For example, suppose the target detection point from which physical ray a is emitted has coordinates (am1, am2, am3). If physical ray a has an intersection point with the second sub-model in its direction, with coordinates (ap1, ap2, ap3), then the distance information between the first sub-model and the second sub-model corresponding to the current target detection point in that direction is

l = √((am1 − ap1)² + (am2 − ap2)² + (am3 − ap3)²).

If physical ray a has no intersection point with the second sub-model in its direction, preset maximum distance information may be used as the distance information between the target detection point and the second sub-model in that direction, for example l = 10 nm. Further, the distance information between the first sub-model and the second sub-model can be determined in this way for each target detection point.
From the determined distance information of a target detection point in all directions in space, the distance function of its spherical distribution can be determined. For example, the spherically distributed distance function of target detection point A is

F_A = { f(i) = dist_i, i = 1, 2, …, n },

where i denotes the ith direction, f(i) denotes the distance information in the ith direction, dist_i denotes the specific distance value, and n denotes the total number of directions.
It should be noted that the spherically distributed distance function of each target detection point is a composite function. The number of sub-functions in the composite function may be determined according to a preset number of samples; the default precision may be 16 × 32, i.e. the composite function contains 512 sub-functions. The greater the number of samples, the more sub-functions the composite function contains and the higher the accuracy; the specific sampling number can be chosen according to actual requirements.
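As a concrete illustration of this sampling step, the following Python sketch builds the spherically distributed distance samples for one target detection point. It uses the default 16 × 32 direction grid; the sphere used as a stand-in second sub-model and the raycast helper are assumptions for the demo, not the patent's prescribed engine query.

```python
import math

# Sketch of S110 for one target detection point.
MAX_DIST = 10.0          # preset maximum distance when a ray has no hit
N_THETA, N_PHI = 16, 32  # default sampling precision: 16 x 32 = 512 rays

def raycast(origin, direction):
    """Toy stand-in for the engine's physics ray query: intersects the
    ray with a sphere of radius 2 at the origin (our placeholder second
    sub-model). Returns the nearest hit point, or None on a miss."""
    ox, oy, oz = origin
    dx, dy, dz = direction
    b = 2.0 * (ox * dx + oy * dy + oz * dz)
    c = ox * ox + oy * oy + oz * oz - 4.0
    disc = b * b - 4.0 * c
    if disc < 0.0:
        return None
    t = (-b - math.sqrt(disc)) / 2.0
    if t < 0.0:
        t = (-b + math.sqrt(disc)) / 2.0
    return None if t < 0.0 else (ox + t * dx, oy + t * dy, oz + t * dz)

def sample_distance_function(point):
    """Emit rays in all directions from `point` (the sphere center) and
    record the distance to the second sub-model in each direction."""
    samples = []
    for i in range(N_THETA):
        theta = math.pi * (i + 0.5) / N_THETA        # polar angle
        for j in range(N_PHI):
            phi = 2.0 * math.pi * (j + 0.5) / N_PHI  # azimuth
            d = (math.sin(theta) * math.cos(phi),
                 math.sin(theta) * math.sin(phi),
                 math.cos(theta))
            hit = raycast(point, d)
            dist = MAX_DIST if hit is None else math.dist(point, hit)
            samples.append(((theta, phi), dist))     # f(i) = dist_i
    return samples

samples = sample_distance_function((3.0, 0.0, 0.0))  # a point on the outer model
```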
S120, processing each distance function based on a preset spherical harmonic function to obtain the projection coefficient values of each target detection point.
It should be noted that, the projection coefficient values of each target detection point are determined in the same manner, and for the sake of clarity, the technical solution of the present embodiment is described by taking the determination of the projection coefficient value of one of the target detection points as an example.
The spherical harmonic function is composed of a plurality of basis functions, and each distance function can be processed with these basis functions. Spherical harmonics of different orders contain different numbers of basis functions: a second-order spherical harmonic contains 4 basis functions, a third-order one contains 9, and a fourth-order one contains 16. The projection coefficient values are obtained by processing the distance function with the basis functions of the spherical harmonic, so their number equals the number of basis functions. Compressing the spherically distributed distance function of a target detection point with the spherical harmonic and its basis functions yields the projection coefficient values. For example: if the spherical harmonic is second order, i.e. contains 4 basis functions, then inputting the distance function of a target detection point into it yields 4 projection coefficient values, compressing the distance function into 4 values.
It should be noted that the higher the order of the spherical harmonic function, the higher the similarity between the reconstructed sphere and the actual sphere, so a developer can select a spherical harmonic function of a different order according to actual requirements. A higher-order spherical harmonic contains more basis functions, and the distance function is restored to a higher degree when it is later reconstructed from the spherical harmonic and the projection coefficient values.
Specifically, after the specific order of the spherical harmonic function is determined, the distance function of the spherical distribution of the target detection points may be input into the spherical harmonic function, and the distance function is processed based on the basis of the basis functions in the spherical harmonic function, so as to obtain the projection coefficient values of the distance function on each basis function. It follows that the number of projection coefficient values is equal to the number of basis functions in the spherical harmonics.
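The projection step can be sketched as numerical integration of the sampled distance function against each basis function. The snippet below uses the standard real spherical harmonic basis of a second-order SH (4 functions) and assumes samples laid out as in the previous sketch; the normalization constants are the usual real-SH factors, not values taken from the patent.

```python
import math

# Sketch of S120 with a second-order spherical harmonic (4 basis functions).
def sh_basis_2nd_order(theta, phi):
    """The 4 real spherical harmonic basis functions of a 2nd-order SH."""
    x = math.sin(theta) * math.cos(phi)
    y = math.sin(theta) * math.sin(phi)
    z = math.cos(theta)
    return (0.282095,        # Y(0, 0)
            0.488603 * y,    # Y(1,-1)
            0.488603 * z,    # Y(1, 0)
            0.488603 * x)    # Y(1, 1)

def project(samples, n_theta=16, n_phi=32):
    """Numerically integrate f * Y_k over the sphere: one projection
    coefficient per basis function, compressing 512 samples to 4 values."""
    coeffs = [0.0] * 4
    d_theta = math.pi / n_theta
    d_phi = 2.0 * math.pi / n_phi
    for (theta, phi), dist in samples:
        d_omega = math.sin(theta) * d_theta * d_phi  # solid-angle weight
        for k, y in enumerate(sh_basis_2nd_order(theta, phi)):
            coeffs[k] += dist * y * d_omega
    return coeffs
```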
It should be noted that the spherically distributed distance functions of different target detection points may be the same or different. When the distance information of different target detection points is the same in every direction in space, their distance functions are the same; when the distance information differs in at least one direction, the distance functions differ. Whether two distance functions are the same is therefore determined by the distance information of the target detection points in every direction in space, and different distance functions input into the preset spherical harmonic function yield different projection coefficient values.
S130, for each target detection point: storing the projection coefficient values corresponding to the current target detection point into the vertex color corresponding to the current target detection point and/or the attribute information of the current target detection point; importing them into a target position in an engine according to the index coordinates of the vertex; determining the target distance information between each target detection point and the second sub-model at the target shooting angle from data reconstructed from the vertex color and/or attribute information stored at the target position; determining the transparency parameter of each target detection point based on the target distance information; and displaying the first sub-model and the second sub-model based on each transparency parameter.
Each target detection point on the first sub-model may be treated as a vertex with a vertex color; each vertex color corresponds to four pixel channels (RGBA), and a projection coefficient value of the target detection point may be stored in each channel. The attribute information may be extensible information corresponding to each target detection point, for example UV, i.e. the u, v texture map coordinates. The vertex color corresponding to a target detection point and its attribute information can be used together to store the projection coefficient values.
Specifically, when the projection coefficient values corresponding to the current target detection point are stored in the vertex color corresponding to the current target detection point, 4 projection coefficient values can be stored per vertex color, one in each of the four channels R, G, B and A, and the number of vertex colors required can be determined from the number of projection coefficient values. It is also possible to store 4 projection coefficient values in one vertex color and the remaining projection coefficient values in the UV coordinates corresponding to the current target detection point, which reduces the use of vertex colors and facilitates the storage and subsequent retrieval of the projection coefficient values.
Optionally, the vertex color corresponding to the target detection point and the UV coordinate corresponding to the current target detection point may be used together to store the projection coefficient value, for example: if 9 projection coefficient values are stored, 8 projection coefficient values can be stored using two vertex colors, and the remaining 1 projection coefficient value is stored in the UV coordinate corresponding to the current target detection point.
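A minimal sketch of the storage layout just described, for the 9 coefficients of a third-order spherical harmonic: two RGBA vertex colors hold 8 values and a spare UV component holds the ninth. The tuple layout and helper names are illustrative, not an engine API.

```python
# Sketch of packing 9 projection coefficients (3rd-order SH) into two
# vertex colors plus one spare UV component, and unpacking them again.
def pack_coefficients(coeffs):
    """Split 9 projection coefficients into two vertex colors plus one
    spare UV component."""
    assert len(coeffs) == 9
    vertex_color_0 = tuple(coeffs[0:4])  # R, G, B, A of the 1st vertex color
    vertex_color_1 = tuple(coeffs[4:8])  # R, G, B, A of the 2nd vertex color
    uv_extra = coeffs[8]                 # remaining value goes to a UV channel
    return vertex_color_0, vertex_color_1, uv_extra

def unpack_coefficients(vc0, vc1, uv_extra):
    """Inverse operation, used when the engine retrieves the values."""
    return list(vc0) + list(vc1) + [uv_extra]
```

In a real engine, vertex color channels are often limited to 8-bit values in [0, 1], so the coefficients would additionally need to be remapped into that range before packing; that remapping is omitted here.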
After the projection coefficient values are stored in the vertex color and/or attribute information, they can be imported into a target position in the engine according to the index coordinates of the vertex and stored there, so that the projection coefficient values corresponding to the current target detection point can later be retrieved from the engine for each target detection point, the transparency parameter of each target detection point determined from the target distance information reconstructed from the projection coefficient values, and the first sub-model and the second sub-model displayed based on each transparency parameter.
According to the technical solution of this embodiment, the distance function of the spherical distribution of each target detection point on the first sub-model is determined, processed with a preset spherical harmonic function to obtain the projection coefficients, and stored in the vertex color and/or attribute information, so that the transparency parameter at the target shooting angle can be determined from the stored information and the reconstruction function and then displayed. This solves the technical problem that transparently displaying the first sub-model with a fixed transparency display value deviates from the actual situation, giving a poor display effect and a poor user experience; the transparency parameter is instead adjusted to the actual situation, the transparent display matches the theoretically expected effect, and the user experience is improved.
Example two
Fig. 2 is a flowchart illustrating a method for determining transparency according to a second embodiment of the present invention. On the basis of the first embodiment, this embodiment details the specific manner of determining the distance function and of storing the projection coefficient values of each target detection point in the vertex colors; reference may be made to the technical solution of this embodiment. Explanations of terms that are the same as or correspond to those of the above embodiment are omitted here.
Referring to fig. 2, the method for determining transparency according to the present embodiment includes:
S210, for each target detection point on the first sub-model, determining the information of the collision points to be processed where physical rays emitted by the current target detection point in all directions pass through the second sub-model.
The information of each collision point to be processed may include position information of each collision point to be processed, for example: spatial coordinate information, etc.
Specifically, physical rays may be emitted in various directions based on target detection points on the first submodel, the physical rays may possibly pass through the second submodel, when the physical rays pass through the second submodel, it is determined that a collision point to be processed exists between the physical rays and the second submodel, and an intersection point of the physical rays and the second submodel is the collision point to be processed. The collision point information to be processed may be information for describing the position of the point, such as spatial coordinate information. Therefore, the spatial coordinate information corresponding to the collision point to be processed can be determined as the collision point information to be processed according to the collision point to be processed.
In this embodiment, determining collision point information to be processed may be: and the current target detection point can be used as a sphere center, physical rays are emitted to any direction in space, and the information of collision points to be processed when each physical ray passes through the second sub-model is determined.
Specifically, it may be considered that the physical rays emitted from the target detection points on the first sub-model in all directions are emitted in all directions of the spherical surface with the current target detection point as the center of the sphere. And if the physical ray passes through the second submodel, taking the space coordinate information of the intersection point of the physical ray and the second submodel as the collision point information to be processed.
S220, determining the distance information between the current target detection point and the second sub-model in each direction according to the information of the current target detection point and of each collision point to be processed.
And when the physical ray and the second sub-model have collision points to be processed, determining the distance information between the collision point information to be processed and the current target detection point.
Specifically, when there is a collision point to be processed, the distance information between the collision point to be processed and the current target detection point can be calculated by using a calculation formula of the distance between two points in space according to the space coordinate information of the current target detection point and the space coordinate information of the collision point to be processed.
And when the physical ray and the second sub-model have no collision point to be processed, setting the distance information corresponding to the collision point to be processed as a set value.
Specifically, if the physical ray does not pass through the second sub-model, for example because it is emitted parallel to the second sub-model or in a direction away from it, there is no collision point between the physical ray and the second sub-model, and the distance information in that direction may be set to the set value. The set value may be the preset maximum distance between the current target detection point and the second sub-model.
And determining the distance information between the current target detection point and the second submodel in each direction according to the distance information or the set value corresponding to each collision point to be processed.
Specifically, based on the two situations, the distance information or the set value corresponding to each physical ray emitted from the current target detection point may be determined, and the distance information or the set value may be used as the distance information between the current target detection point and the second sub-model in each direction.
S230, determining the distance function of the spherical distribution of the corresponding target detection point according to the distance information of each target detection point in each direction.
Specifically, the distance information of each target detection point in each direction in the space is taken as a sub-function in the distance function of the spherical distribution of the target detection points, so that the distance function of the spherical distribution of the target detection points can be obtained. The number of sub-functions in the distance function is the same as the number of distance information of the target detection points in all directions of the spherical surface.
It should be noted that, in order to improve the accuracy, the number of sub-functions may be increased, that is, the density of the physical rays may be increased, and the number of specific physical rays may be determined according to actual requirements.
S240, determining the order of the spherical harmonic function, and determining the representation mode of the basis functions in the spherical harmonic function and the number of the basis functions according to the order.
The spherical harmonics of different orders contain different numbers of basis functions, for example: the second order spherical harmonic contains 4 basis functions, the third order spherical harmonic contains 9 basis functions, the fourth order spherical harmonic contains 16 basis functions, and the like. The higher the order of the spherical harmonic function is, the better the effect when the reconstruction function is used for reconstruction in the subsequent process is, and the specific order needs to be set according to actual requirements.
Specifically, if the order of the spherical harmonic function is determined to be a according to the requirements, the number of basis functions in the spherical harmonic function is a². The representation of each basis function can be determined from the relationship between the distance function and the projection coefficient values.
S250, for each target detection point, processing the distance function of the current target detection point based on each basis function to obtain the projection coefficient values of the current target detection point.
It should be noted that, the projection coefficient values of each target detection point are determined in the same manner, and for the sake of clarity, the technical solution of the present embodiment is described by taking the determination of the projection coefficient value of one of the target detection points as an example.
Wherein the number of projection coefficient values is the same as the number of basis functions. The projection coefficient value is a value determined by calculation using each basis function in the spherical harmonics set in advance for the distance function.
Specifically, processing the distance function of the target detection point based on each basis function can obtain the projection coefficient value corresponding to the basis function, and therefore, the number of projection coefficient values is the same as the number of basis functions. The distance function of the target detection point is input into each basis function of the spherical harmonic function, and the projection coefficient value of the distance function on each basis function can be obtained.
It should be noted that the distance functions of the spherical distribution of each target detection point are different, and after the different distance functions are input to the preset spherical harmonic function for processing, the obtained projection coefficient values are different.
S260, storing the projection coefficient values corresponding to the current target detection point into the vertex color corresponding to the current target detection point and/or the attribute information of the current target detection point.
In order to store the projection coefficient value corresponding to the current target detection point, the projection coefficient value can be selected to be stored in the vertex color corresponding to the target detection point and/or the attribute information of the current target detection point, so as to be convenient for subsequent retrieval and use.
Optionally, determining a target number of projection coefficient values corresponding to the current target detection points; determining the number of vertex colors corresponding to the current target detection point based on the number of targets and the storage number corresponding to the vertex colors; and storing the projection coefficient value into the vertex color corresponding to the current target detection point.
Wherein, the target number is the number of projection coefficient values and is also the number of basis functions in the preset spherical harmonic function. The storage number is the number of projection coefficient values that can be stored per vertex color, for example: the vertex color contains four channels RGBA, with a storage number of 4. The number of vertex colors is the number of vertex colors used to store the projection coefficient values.
Specifically, the vertex color may be stored in the RGBA channel, i.e., there are 4 channels, or may be stored in the RGB channel, i.e., there are 3 channels. Taking the example of using RGBA channel storage for vertex color, if the preset spherical harmonic is a second-order spherical harmonic, and includes 4 basis functions, then 4 projection coefficient values can be obtained, and then the 4 projection coefficient values are stored into one vertex color corresponding to the current target detection point. If the preset spherical harmonic function is a fourth-order spherical harmonic function and comprises 16 basis functions, then 16 projection coefficient values can be obtained, and the 16 projection coefficient values are stored into four vertex colors corresponding to the current target detection point, wherein the four vertex colors belong to different pictures and correspond to the current target detection point.
It should be noted that, since each vertex color can store 4 projection coefficient values, if the number of projection coefficient values is not a multiple of 4, the number of vertex colors must be rounded up. For example: a third-order spherical harmonic contains 9 basis functions and therefore corresponds to 9 projection coefficient values; two vertex colors can store 8 of them, and the remaining 1 still needs a vertex color of its own, so 3 vertex colors are required in total. A sketch of this rounding rule follows.
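The rounding rule reduces to a ceiling division, as the small hypothetical helper below shows; the channel count of 4 corresponds to RGBA storage.

```python
import math

# Each vertex color holds 4 values (RGBA), so the number of vertex colors
# needed is the coefficient count divided by 4, rounded up.
def vertex_colors_needed(num_coeffs, channels_per_color=4):
    return math.ceil(num_coeffs / channels_per_color)

assert vertex_colors_needed(4) == 1    # 2nd-order SH
assert vertex_colors_needed(9) == 3    # 3rd-order SH, last color 3/4 unused
assert vertex_colors_needed(16) == 4   # 4th-order SH
```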
The following can also be used: after storing partial projection coefficient values using one vertex color, the remaining projection coefficient values are stored using attribute information of the vertex color.
Optionally, determining a target number of projection coefficient values corresponding to the current target detection points; determining the preset number of the storage projection coefficient values according to the vertex color of the current target detection point; and storing the residual projection coefficient values into the attribute information of the vertex colors according to the target number and the preset number.
Wherein the preset number is the number of projection coefficient values that the vertex color can store.
Specifically, the residual projection coefficient value may be determined according to the difference between the target number and the preset number, and the residual projection coefficient value may be stored in the attribute information of the vertex color. For example: each vertex color may store 4 projection coefficient values, while a third order spherical harmonic contains 9 basis functions, corresponding to 9 projection coefficient values. One vertex color corresponding to a target detection point may store 4 projection coefficient values and 5 remaining projection coefficient values into the UV coordinates corresponding to the target detection point.
It is also possible to store 8 projection coefficient values using two vertex colors corresponding to a target detection point, and store 1 remaining projection coefficient value into the UV coordinate corresponding to the target detection point. The storage mode can reduce the use purpose of the vertex color.
S270, importing the vertex color and/or the attribute information into the target position in the engine according to the coordinate information, for storage.
The engine may be the core component of an editable computer game system, or of certain interactive real-time graphics applications. The target position may be a storage space in the engine for data and/or information; in this embodiment, it is the storage space addressed by the coordinate information of the target detection points.
Specifically, after the projection coefficient values corresponding to the current target detection point have been stored in its vertex color and/or attribute information, the vertex color and/or attribute information must be imported into the target position in the engine and stored according to the coordinate information of the target detection point, so that the engine can retrieve and use the stored projection coefficient values. When the engine needs the projection coefficient values of a certain target detection point, it can determine them from the target position addressed by the coordinate information of that point, for use in the subsequent reconstruction.
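A sketch of this import step, under the assumption that the target position behaves like a lookup table keyed by the vertex index coordinate; a plain dictionary stands in for whatever storage the engine actually exposes.

```python
# Sketch of S270: store the packed data at a target position keyed by the
# vertex index, so the engine can retrieve it for reconstruction later.
coefficient_store = {}  # vertex index -> packed projection coefficients

def import_into_engine(vertex_index, vc0, vc1, uv_extra):
    """Store the packed vertex colors and spare UV value at the target position."""
    coefficient_store[vertex_index] = (vc0, vc1, uv_extra)

def fetch_for_reconstruction(vertex_index):
    """Retrieve the packed data later, when the engine rebuilds distances."""
    return coefficient_store[vertex_index]
```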
According to the technical solution of the embodiments of the invention, the distance function of the spherical distribution of each target detection point on the first sub-model is determined, processed with a preset spherical harmonic function to obtain the projection coefficients, and stored in the vertex color and/or attribute information, so that the transparency parameter at the target shooting angle can be determined from the stored information and the reconstruction function and then displayed. This solves the technical problem that transparently displaying the first sub-model with a fixed transparency display value deviates from the actual situation, giving a poor display effect and a poor user experience; the transparency parameter is instead adjusted to the actual situation, the transparent display matches the theoretically expected effect, and the user experience is improved.
Example three
Fig. 3 is a flowchart of a method for determining transparency according to a third embodiment of the present invention, where this embodiment is applicable to a situation where distance information between a target detection point and a second sub-model at each angle can be determined by reconstructing from projection coefficient values in vertex color and/or attribute information, and performing transparent display according to the distance information, and the method may be implemented by an apparatus for determining transparency, where the apparatus may be implemented in the form of software and/or hardware, and the hardware may be an electronic device, and optionally, the electronic device may be a mobile terminal, and the like. Wherein explanations of the same or corresponding terms as those of the above embodiments are omitted.
As shown in fig. 3, this embodiment specifically includes the following steps:
S310, determining the target shooting angle corresponding to the shooting device and each target detection point on the first sub-model.
The shooting device is used for observing and shooting the first sub-model, and the target shooting angle is a relative angle between the shooting device and each target detection point on the first sub-model.
Specifically, the target shooting angle is different for each target detection point. The relative angle between the shooting device and each target detection point can be determined from the relative positions of the shooting device and each target detection point on the first sub-model, and this angle information is used as the target shooting angle.
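One way to realize this is sketched below under an assumed Cartesian-to-spherical convention: the target shooting angle is taken as the spherical angles (theta, phi) of the direction from the detection point to the shooting device, matching the earlier sampling sketches.

```python
import math

# Sketch of S310: the target shooting angle for one detection point,
# expressed as spherical angles of the point-to-camera direction.
def target_shooting_angle(point, camera):
    dx = camera[0] - point[0]
    dy = camera[1] - point[1]
    dz = camera[2] - point[2]
    r = math.sqrt(dx * dx + dy * dy + dz * dz)
    theta = math.acos(dz / r)   # polar angle of the viewing direction
    phi = math.atan2(dy, dx)    # azimuth of the viewing direction
    return theta, phi

theta, phi = target_shooting_angle((3.0, 0.0, 0.0), (6.0, 2.0, 1.0))
```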
S320, aiming at each target detection point, a projection coefficient value corresponding to the current target detection point is called from the vertex color and/or attribute information corresponding to the current target detection point, and distance information between a first sub-model and a second sub-model corresponding to the current target detection point is determined according to the projection coefficient value and the target shooting angle.
The first sub-model and the second sub-model are defined relative to each other; if the application scene is a skin model and a clothes model, the model corresponding to the clothes can be used as the first sub-model and the model corresponding to the skin as the second sub-model. Each target detection point corresponds to one or more vertex colors and/or pieces of attribute information; for a given target detection point, the vertex color and/or attribute information contains the projection coefficient values determined by processing the spherically distributed distance function of that point on the first sub-model with the spherical harmonic function.
Specifically, when each target detection point on the first sub-model needs to be displayed transparently, the projection coefficient values of the current target detection point can be obtained from its vertex color and/or attribute information, and a distance function containing the corresponding distance information at every angle can be simulated from the stored projection coefficient values. For example, from the projection coefficient values one can simulate the distance to the second sub-model in each direction that would result from emitting physical rays in every direction in space with the current target detection point as the sphere center. From the current target detection point and the shooting device, optionally a camera or the pupil of a human eye in front of the display screen, the target shooting angle between the shooting device and the current target detection point can be determined; based on this angle, the collision point on the second sub-model corresponding to the current target detection point and the distance information between that collision point and the current target detection point can be determined.
It should be noted that, for each target detection point, the target shooting angle between the target detection point and the shooting device may be determined in the above manner, and the distance information between each target detection point and the collision point in the second sub-model may be determined based on the target shooting angle.
For example, when the current target detection point on the first sub-model is displayed transparently, the 2 vertex colors and 1 piece of attribute information corresponding to it may be determined; since each of the four RGBA channels of each vertex color stores 1 projection coefficient value, together with the attribute information the 9 projection coefficient values corresponding to the current target detection point are obtained. Processing these 9 projection coefficient values with the preset spherical harmonic function determines 9 pieces of distance information, i.e. the distance information of the target detection point at 9 angles in space, from which a reconstruction function can be constructed. Inputting a target shooting angle, such as 45°, into the constructed reconstruction function then yields the distance information corresponding to that angle, such as 7 nm.
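The retrieval-and-reconstruction step can be sketched as evaluating the 9 real basis functions of a third-order spherical harmonic at the target shooting angle and summing them against the stored coefficients; the basis constants are the standard real-SH normalization factors, and the angle convention follows the earlier sketches (an assumption).

```python
import math

# Sketch of reconstruction for a 3rd-order SH (9 stored coefficients).
def sh_basis_3rd_order(theta, phi):
    """The 9 real spherical harmonic basis functions of a 3rd-order SH."""
    x = math.sin(theta) * math.cos(phi)
    y = math.sin(theta) * math.sin(phi)
    z = math.cos(theta)
    return (0.282095,                                   # band 0
            0.488603 * y, 0.488603 * z, 0.488603 * x,   # band 1
            1.092548 * x * y, 1.092548 * y * z,         # band 2
            0.315392 * (3.0 * z * z - 1.0),
            1.092548 * x * z, 0.546274 * (x * x - y * y))

def reconstruct_distance(coeffs, theta, phi):
    """dist(theta, phi) = sum over k of c_k * Y_k(theta, phi)."""
    basis = sh_basis_3rd_order(theta, phi)
    return sum(c * y for c, y in zip(coeffs, basis))
```

A convenient sanity check for the round trip: a constant distance function projects onto a single non-zero coefficient (band 0), and reconstructing it at any angle returns the same constant.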
S330, determining the transparency parameter of each target detection point according to the distance information corresponding to each target detection point, and displaying the first sub-model and the second sub-model based on the transparency parameter.
The transparency parameter is used to represent the transparency of the model when displaying, and may be represented by a percentage, for example: transparency of 80%, etc.
It should be noted that, the transparency parameter of each target detection point is determined in the same manner, and for clarity, the transparency parameter of one of the target detection points is determined as an example to describe the technical solution of the present embodiment.
Specifically, according to the distance information between the first submodel and the second submodel corresponding to the current target detection point, determining the transparency parameter of the current target detection point may be: the transparency parameter is determined in the corresponding relation between the distance information and the transparency parameter which are stored in advance according to the distance information, or the calculation can be carried out according to a preset transparency parameter calculation model, the distance information is input into the transparency parameter calculation model, and the transparency parameter corresponding to the distance information can be obtained through calculation.
Illustratively, with the distance information written as dist: if 0 nm < dist ≤ 5 nm the transparency parameter is 90%, and if 5 nm < dist ≤ 10 nm the transparency parameter is 80%; so when dist = 7.5 nm the corresponding transparency parameter is 80%.
Illustratively, the transparency calculation formula is a_i = f(l_i), where a_i denotes the transparency parameter of the ith target detection point, l_i denotes the corresponding distance information, and f is a monotonically decreasing function.
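As one admissible choice of such a monotonically decreasing f, the sketch below uses a clamped linear falloff; the maximum distance of 10.0 is illustrative, not specified by the patent.

```python
# Sketch of one possible monotonically decreasing mapping a_i = f(l_i).
def transparency_from_distance(dist, max_dist=10.0):
    """Smaller distance -> stronger semi-transparent effect -> higher
    transparency parameter; at max_dist and beyond it falls to zero."""
    t = max(0.0, min(1.0, dist / max_dist))
    return 1.0 - t

assert transparency_from_distance(0.0) == 1.0
assert transparency_from_distance(10.0) == 0.0
```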
After the transparency parameter of each target detection point has been determined, in order to give a better visual experience of the first sub-model and the second sub-model, each target detection point on the first sub-model can be displayed according to its transparency parameter, and the intersection point of the second sub-model with the straight line through the target detection point and the shooting device displayed accordingly, producing the transparent display effect.
By determining the transparency parameter of each target detection point in this way, the first sub-model and the second sub-model can be displayed transparently.
In the technical solution of this embodiment, the projection coefficient values are retrieved from the vertex color and/or attribute information, and the transparency parameter is determined and displayed from the determined target shooting angle and the distance information between the first sub-model and the second sub-model corresponding to each target detection point. This solves the technical problem that transparently displaying the first sub-model with a fixed transparency display value deviates from the actual situation, giving a poor display effect and a poor user experience; the transparency parameter is adjusted to the actual situation, the transparent display matches the theoretically expected effect, and the user experience is improved.
Example four
Fig. 4 is a flowchart illustrating a method for determining transparency according to a fourth embodiment of the present invention. On the basis of the above embodiments, a reconstruction function may be built from the spherical harmonic function and the projection coefficient values, and the transparency parameter of a target detection point at the relative shooting angle between the target detection point and the shooting device may be determined from it; the specific implementation is described in detail below. Explanations of terms that are the same as or correspond to those of the above embodiments are omitted here.
Referring to fig. 4, the method for determining transparency according to the present embodiment includes:
and S410, determining a target shooting angle corresponding to each target detection point on the shooting device and the first sub-model.
S420, for each target detection point, determining the projection coefficient values corresponding to the current target detection point according to at least one vertex color and/or attribute information corresponding to the current target detection point.
Specifically, when each target detection point on the first submodel needs to be transparently displayed, the projection coefficient value of the current target detection point may be acquired from the vertex color and/or attribute information corresponding to the current target detection point. For example: two vertex colors and attribute information corresponding to the current target detection point are determined, each vertex color corresponds to four channels of RGBA and can store 4 projection coefficient values, the two vertex colors can store 8 projection coefficient values, and the attribute information stores 1 projection coefficient value, so that 9 projection coefficient values corresponding to the current target detection point can be determined, and the 9 projection coefficient values corresponding to the current target detection point are extracted and used in a subsequent reconstruction function.
S430, processing the projection coefficient values based on a preset spherical harmonic function, and determining the reconstruction function of the current target detection point.
Based on the preset spherical harmonic function, the reconstruction function corresponding to each target detection point can be reconstructed, and for clarity of describing the technical solution of this embodiment, the reconstruction function for reconstructing one of the target detection points can be described as an example.
The reconstruction function of a target detection point is the function obtained by processing the projection coefficients of the current target detection point with the spherical harmonic function; it describes, for each direction in space, the distance value at which a physical ray emitted from the current target detection point in that direction collides with the second sub-model. The reconstruction function may be configured to process an input target shooting angle to determine the distance information between the first sub-model and the second sub-model corresponding to the target detection point at that angle.
Specifically, the projection coefficient values corresponding to the current target detection point are processed based on each basis function in the preset spherical harmonic function, which recovers the distance information between the current target detection point and the collision point at which a physical ray emitted from the point in each spatial direction collides with the second sub-model. The reconstruction function corresponding to the current target detection point is then constructed from this distance information over all spatial directions.
S440, inputting the target shooting angle corresponding to the current target detection point into the reconstruction function to obtain the distance information at which the straight line through the current target detection point and the shooting device intersects the first sub-model and the second sub-model.
Specifically, after the target shooting angle between the current target detection point and the shooting device is determined, the angle may be input into the reconstruction function corresponding to the current target detection point. The reconstruction function processes the angle and outputs the distance information between the current target detection point and the collision point at which the straight line through the current target detection point and the shooting device collides with the second sub-model at that angle.
For example, if 9 projection coefficient values corresponding to the current target detection point are obtained, performing the inverse transformation with the preset spherical harmonic function on these 9 values yields the reconstruction function for the point. Inputting the target shooting angle, for example 45°, into the reconstruction function then determines the distance information between the first sub-model and the second sub-model corresponding to the current target detection point at that angle, for example 5 nm.
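As a minimal sketch of this inverse transformation (assuming a second-order real spherical harmonic basis, which has exactly nine basis functions; the patent does not fix the order, and the constants below are the standard real-SH normalization factors), evaluating the reconstruction function at a view direction reduces to a dot product between the stored coefficients and the basis values:

import math

def sh_basis_order2(x, y, z):
    """Evaluate the nine real spherical harmonic basis functions (l = 0, 1, 2) at a unit direction."""
    return [
        0.282095,                        # Y(0,  0)
        0.488603 * y,                    # Y(1, -1)
        0.488603 * z,                    # Y(1,  0)
        0.488603 * x,                    # Y(1,  1)
        1.092548 * x * y,                # Y(2, -2)
        1.092548 * y * z,                # Y(2, -1)
        0.315392 * (3.0 * z * z - 1.0),  # Y(2,  0)
        1.092548 * x * z,                # Y(2,  1)
        0.546274 * (x * x - y * y),      # Y(2,  2)
    ]

def reconstruct_distance(coeffs, direction):
    """Reconstruction function: dot product of the nine stored coefficients with the basis values."""
    x, y, z = direction
    norm = math.sqrt(x * x + y * y + z * z)
    basis = sh_basis_order2(x / norm, y / norm, z / norm)
    return sum(c * b for c, b in zip(coeffs, basis))

# Example: evaluate along a 45-degree direction in the x-z plane.
view = (math.cos(math.radians(45)), 0.0, math.sin(math.radians(45)))
distance = reconstruct_distance([5.0, 0.0, 0.4, 0.4, 0.0, 0.0, 0.1, 0.2, 0.1], view)

With the nine coefficients unpacked as in the earlier sketch, reconstruct_distance returns the approximate distance along the given shooting direction.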
S450, determining the transparency parameter of each target detection point according to the preset correspondence between distance information and transparency parameters, together with the distance information corresponding to each target detection point.
The correspondence between distance information and transparency parameters can be stored in advance, for example, with the transparency parameter decreasing by ten percent for every 10 nm of distance: with the distance recorded as dist, the transparency parameter is 100% for 0 nm < dist ≤ 10 nm, 90% for 10 nm < dist ≤ 20 nm, 80% for 20 nm < dist ≤ 30 nm, and so on.
It should be noted that the transparency parameter may include a transparency parameter of the first sub-model and a transparency parameter of the second sub-model.
Specifically, according to the distance information corresponding to each target detection point, the corresponding transparency parameter may be looked up in the pre-stored correspondence between distance information and transparency parameters; the result may include the transparency parameter of the first sub-model and the transparency parameter of the second sub-model, for use in the subsequent transparent display.
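A minimal sketch of such a lookup, assuming the illustrative ten-percent-per-10 nm table above (boundary handling at exact multiples of 10 nm, and clamping at 0% beyond the table, are assumptions):

def transparency_from_distance(dist_nm):
    """Map a reconstructed distance (in nm) to a transparency parameter in [0, 1]."""
    step = int(dist_nm // 10)           # 0-10 nm -> 0, 10-20 nm -> 1, ...
    return max(0.0, 1.0 - 0.1 * step)   # drop ten percent per 10 nm, clamp at 0

assert abs(transparency_from_distance(5.0) - 1.0) < 1e-9    # 100%
assert abs(transparency_from_distance(15.0) - 0.9) < 1e-9   # 90%
assert abs(transparency_from_distance(25.0) - 0.8) < 1e-9   # 80%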
S460, displaying the first sub-model and the second sub-model based on the transparency parameter.
Specifically, after the transparency parameters of the target detection points at the target shooting angle are determined for displaying the first sub-model and the second sub-model, the transparent display effect of the relative positions of the first sub-model and the second sub-model can be achieved based on those parameters.
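By way of illustration, one way such a parameter could drive the display is ordinary alpha blending; the patent does not fix a blending scheme, so the mapping of the transparency parameter to an alpha value below is an assumption:

def shade_with_transparency(surface_rgb, behind_rgb, transparency):
    """Blend assuming transparency = 1.0 means fully see-through (alpha = 1 - transparency)."""
    alpha = 1.0 - transparency
    return tuple(alpha * s + (1.0 - alpha) * b
                 for s, b in zip(surface_rgb, behind_rgb))

# Example: a point 15 nm from the inner model (transparency 90%) lets
# most of the inner model's color show through.
color = shade_with_transparency((1.0, 0.0, 0.0), (0.0, 0.0, 1.0), 0.9)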
According to the technical scheme of this embodiment, the projection coefficient values are retrieved from the vertex color and/or attribute information, and the transparency parameter is determined and displayed based on the determined target shooting angle and the distance information between the first sub-model and the second sub-model corresponding to each target detection point. This solves the technical problem that transparently displaying the first sub-model with a fixed transparency value deviates from the actual situation, producing a poor transparent display effect and a poor user experience; instead, the transparency parameter is adjusted according to the actual situation, so that the transparent display effect matches the theoretically expected effect, thereby improving the user experience.
EXAMPLE five
Fig. 5 is a schematic structural diagram of an apparatus for determining transparency according to a fifth embodiment of the present invention, where the apparatus includes: a distance information determination module 510, a projection coefficient value determination module 520, and a projection coefficient value storage module 530.
The distance information determining module 510 is configured to determine a distance function of the spherical distribution of each target detection point on the first sub-model, where the distance function is determined according to the distance information between the target detection point and the second sub-model in each direction, and the first sub-model is a model wrapping the second sub-model. The projection coefficient value determining module 520 is configured to process each distance function based on a preset spherical harmonic function to obtain the projection coefficient value of each target detection point on each basis function, where the spherical harmonic function is composed of a plurality of basis functions. The projection coefficient value storage module 530 is configured to, for each target detection point, store the projection coefficient values corresponding to the current target detection point into the vertex color corresponding to the current target detection point and/or the attribute information of the current target detection point, import them into a target position in the engine according to the index coordinates of the vertex, determine, from data reconstructed based on the vertex color and/or attribute information stored at the target position, the target distance information between each target detection point and the second sub-model at the target shooting angle, determine the transparency parameter of each target detection point based on the target distance information, and display the first sub-model and the second sub-model based on each transparency parameter.
Optionally, the distance information determining module 510 is further configured to determine, for each target detection point on the first sub-model, information of each collision point to be processed when the current target detection point emits physical rays in each direction and the rays pass through the second sub-model; determine the distance information between the current target detection point and the second sub-model in each direction according to the information of the current target detection point and each collision point to be processed; and determine the distance function of the spherical distribution of the corresponding target detection point according to the distance information of each target detection point in each direction.
Optionally, the distance information determining module 510 is further configured to emit physical rays in all spatial directions with the current target detection point as the sphere center, and determine the information of the collision points to be processed when each physical ray passes through the second sub-model.
Optionally, the distance information determining module 510 is further configured to determine, when there is a collision point to be processed between the physical ray and the second sub-model, distance information between the collision point to be processed and the current target detection point; when the physical ray and the second sub-model do not have a collision point to be processed, setting distance information corresponding to the collision point to be processed as a set value; and determining the distance information between the current target detection point and the second submodel in each direction according to the distance information or the set value corresponding to each collision point to be processed.
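By way of illustration of this sampling step (sample_distance_sphere is an illustrative name, cast_ray is a hypothetical engine helper returning the hit distance against the second sub-model or None on a miss, and MISS_VALUE is an assumed sentinel for the "set value" mentioned above):

import math

MISS_VALUE = 0.0   # assumed set value recorded when a ray never hits the second sub-model

def sample_distance_sphere(point, cast_ray, n_theta=16, n_phi=32):
    """Emit rays from `point` over the whole sphere; record the hit distance per direction."""
    samples = []
    for i in range(n_theta):
        theta = math.pi * (i + 0.5) / n_theta      # polar angle
        for j in range(n_phi):
            phi = 2.0 * math.pi * j / n_phi        # azimuth
            direction = (math.sin(theta) * math.cos(phi),
                         math.sin(theta) * math.sin(phi),
                         math.cos(theta))
            dist = cast_ray(point, direction)      # hit distance, or None on a miss
            samples.append((direction, dist if dist is not None else MISS_VALUE))
    return samples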
Optionally, the projection coefficient value determining module 520 is further configured to determine an order of the spherical harmonic function, and determine the representation and the number of the basis functions in the spherical harmonic function according to the order; and, for each target detection point, process the distance function of the current target detection point based on each basis function to obtain the projection coefficient values of the current target detection point, the number of projection coefficient values being the same as the number of basis functions.
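A minimal sketch of this projection, assuming the second-order nine-function basis from the earlier sketch (sh_basis_order2) and the latitude-longitude samples produced above; the sin(theta) weighting compensates for the non-uniform grid so that the weighted sum approximates the projection integral over the sphere:

import math

def project_to_sh(samples, sh_basis):
    """samples: list of ((x, y, z), distance); returns nine projection coefficients."""
    coeffs = [0.0] * 9
    total_weight = 0.0
    for (x, y, z), dist in samples:
        weight = math.sqrt(max(0.0, 1.0 - z * z))   # sin(theta): area weight of a grid cell
        total_weight += weight
        basis = sh_basis(x, y, z)
        for k in range(9):
            coeffs[k] += weight * dist * basis[k]
    # Normalize so the weighted sum approximates the integral over the
    # sphere's 4*pi steradians of solid angle.
    scale = 4.0 * math.pi / total_weight
    return [c * scale for c in coeffs]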
Optionally, the projection coefficient value storage module 530 is further configured to determine the target number of projection coefficient values corresponding to the current target detection point; determine the number of vertex colors corresponding to the current target detection point based on the target number and the storage capacity of each vertex color; and store the projection coefficient values into the vertex colors corresponding to the current target detection point.
Optionally, the projection coefficient value storage module 530 is further configured to determine the target number of projection coefficient values corresponding to the current target detection point; determine the preset number of projection coefficient values stored in the vertex color of the current target detection point; and store the remaining projection coefficient values into the attribute information according to the target number and the preset number.
Optionally, the projection coefficient value storage module 530 is further configured to import the vertex color and/or the attribute information into a target location in the engine according to the coordinate information for storage.
According to the technical scheme of this embodiment, the distance function of the spherical distribution of each target detection point on the first sub-model is determined, processed based on the preset spherical harmonic function to obtain the projection coefficients, and the coefficients are stored in the vertex color and/or attribute information, so that the transparency parameter at the target shooting angle can be determined from the stored information and the reconstruction function and then displayed. This solves the technical problem that transparently displaying the first sub-model with a fixed transparency value deviates from the actual situation, producing a poor transparent display effect and a poor user experience; instead, the transparency parameter is adjusted according to the actual situation, so that the transparent display effect matches the theoretically expected effect, thereby improving the user experience.
The device for determining the transparency provided by the embodiment of the invention can execute the method for determining the transparency provided by any embodiment of the invention, and has corresponding functional modules and beneficial effects of the execution method.
It should be noted that, the units and modules included in the apparatus for determining transparency are merely divided according to functional logic, but are not limited to the above division as long as the corresponding functions can be implemented; in addition, specific names of the functional units are only for convenience of distinguishing from each other, and are not used for limiting the protection scope of the embodiment of the invention.
EXAMPLE six
Fig. 6 is a schematic structural diagram of an apparatus for determining transparency according to a sixth embodiment of the present invention, where the apparatus includes: a target photographing angle determining module 610, a distance information determining module 620 and a transparent display module 630.
The target shooting angle determining module 610 is configured to determine the target shooting angle between the shooting device and each target detection point on the first sub-model. The distance information determining module 620 is configured to, for each target detection point, retrieve the projection coefficient value corresponding to the current target detection point from the vertex color and/or attribute information corresponding to the current target detection point, and determine the distance information between the first sub-model and the second sub-model corresponding to the current target detection point according to the projection coefficient value and the target shooting angle; the first sub-model is a model wrapping the second sub-model, and the projection coefficient value stored in the vertex color and/or attribute information is determined after the distance function of the spherical distribution of each target detection point on the first sub-model is processed based on the spherical harmonic function. The transparent display module 630 is configured to determine the transparency parameter of each target detection point according to the distance information corresponding to each target detection point, and display the first sub-model and the second sub-model based on the transparency parameter.
Optionally, the distance information determining module 620 is further configured to, for each target detection point, determine the projection coefficient value corresponding to the current target detection point according to at least one vertex color and/or attribute information corresponding to the current target detection point; process the projection coefficient value based on a preset spherical harmonic function to determine the reconstruction function of the current target detection point; and input the target shooting angle corresponding to the current target detection point into the reconstruction function to obtain the distance information at which the straight line through the current target detection point and the shooting device intersects the first sub-model and the second sub-model.
Optionally, the transparent display module 630 is further configured to determine the transparency parameter of each target detection point according to a preset correspondence between the distance information and the transparency parameter and the distance information corresponding to each target detection point; displaying the first sub-model and the second sub-model based on the transparency parameter.
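Tying the earlier sketches together (all names are the illustrative ones introduced above, not the patent's API), the per-point pipeline of this apparatus might read:

def transparency_at_angle(color0, color1, attribute, view_direction):
    """Per-point pipeline: unpack stored coefficients, reconstruct the distance, look up transparency."""
    coeffs = unpack_coefficients(color0, color1, attribute)   # nine SH coefficients
    dist = reconstruct_distance(coeffs, view_direction)       # distance at the shooting angle
    return transparency_from_distance(dist)                   # transparency parameter for display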
According to the technical scheme of this embodiment, the projection coefficient values are retrieved from the vertex color and/or attribute information, and the transparency parameter is determined and displayed based on the determined target shooting angle and the distance information between the first sub-model and the second sub-model corresponding to each target detection point. This solves the technical problem that transparently displaying the first sub-model with a fixed transparency value deviates from the actual situation, producing a poor transparent display effect and a poor user experience; instead, the transparency parameter is adjusted according to the actual situation, so that the transparent display effect matches the theoretically expected effect, thereby improving the user experience.
The device for determining the transparency provided by the embodiment of the invention can execute the method for determining the transparency provided by any embodiment of the invention, and has corresponding functional modules and beneficial effects of the execution method.
It should be noted that, the units and modules included in the apparatus for determining transparency are merely divided according to functional logic, but are not limited to the above division as long as the corresponding functions can be implemented; in addition, specific names of the functional units are only for convenience of distinguishing from each other, and are not used for limiting the protection scope of the embodiment of the invention.
EXAMPLE seven
Fig. 7 is a schematic structural diagram of an electronic device according to a seventh embodiment of the present invention. FIG. 7 illustrates a block diagram of an exemplary electronic device 70 suitable for use in implementing embodiments of the present invention. The electronic device 70 shown in fig. 7 is only an example, and should not bring any limitation to the functions and the scope of use of the embodiment of the present invention.
As shown in fig. 7, the electronic device 70 is embodied in the form of a general purpose computing device. The components of the electronic device 70 may include, but are not limited to: one or more processors or processing units 701, a system memory 702, and a bus 703 that couples various system components including the system memory 702 and the processing unit 701.
Bus 703 represents one or more of any of several types of bus structures, including a memory bus or memory controller, a peripheral bus, an accelerated graphics port, a processor, or a local bus using any of a variety of bus architectures. By way of example, such architectures include, but are not limited to, Industry Standard Architecture (ISA) bus, Micro Channel Architecture (MCA) bus, Enhanced ISA (EISA) bus, Video Electronics Standards Association (VESA) local bus, and Peripheral Component Interconnect (PCI) bus.
Electronic device 70 typically includes a variety of computer system readable media. Such media may be any available media that is accessible by electronic device 70 and includes both volatile and nonvolatile media, removable and non-removable media.
The system memory 702 may include computer system readable media in the form of volatile memory, such as Random Access Memory (RAM) 704 and/or cache memory 705. The electronic device 70 may further include other removable/non-removable, volatile/nonvolatile computer system storage media. By way of example only, the storage system 706 may be used to read from and write to non-removable, nonvolatile magnetic media (not shown in FIG. 7, commonly referred to as a "hard drive"). Although not shown in FIG. 7, a magnetic disk drive for reading from and writing to a removable, nonvolatile magnetic disk (e.g., a "floppy disk") and an optical disk drive for reading from or writing to a removable, nonvolatile optical disk (e.g., a CD-ROM, DVD-ROM, or other optical media) may be provided. In these cases, each drive may be connected to bus 703 via one or more data media interfaces. Memory 702 may include at least one program product having a set (e.g., at least one) of program modules that are configured to carry out the functions of embodiments of the invention.
A program/utility 708 having a set (at least one) of program modules 707 may be stored, for example, in memory 702, such program modules 707 including, but not limited to, an operating system, one or more application programs, other program modules, and program data, each of which examples or some combination thereof may comprise an implementation of a network environment. The program modules 707 generally perform the functions and/or methodologies of the described embodiments of the invention.
The electronic device 70 may also communicate with one or more external devices 709 (e.g., keyboard, pointing device, display 710, etc.), one or more devices that enable a user to interact with the electronic device 70, and/or any device (e.g., network card, modem, etc.) that enables the electronic device 70 to communicate with one or more other computing devices. Such communication may occur via an input/output (I/O) interface 711. Also, the electronic device 70 may communicate with one or more networks (e.g., a Local Area Network (LAN), a Wide Area Network (WAN), and/or a public network, such as the internet) via the network adapter 712. As shown, the network adapter 712 communicates with the other modules of the electronic device 70 over a bus 703. It should be appreciated that although not shown in FIG. 7, other hardware and/or software modules may be used in conjunction with electronic device 70, including but not limited to: microcode, device drivers, redundant processing units, external disk drive arrays, RAID systems, tape drives, and data backup storage systems, among others.
The processing unit 701 executes programs stored in the system memory 702 to perform various functional applications and data processing, for example, implementing the method for determining transparency provided by the embodiments of the present invention.
Example eight
An eighth embodiment of the present invention also provides a storage medium containing computer-executable instructions which, when executed by a computer processor, perform a method of determining transparency.
The method comprises the following steps:
respectively determining a distance function of spherical distribution of each target detection point on the first submodel, wherein the distance function is determined according to the distance information between the target detection points and the second submodel in each direction; the first sub-model is a model wrapping the second sub-model;
processing each distance function based on a preset spherical harmonic function to obtain a projection coefficient value of each target detection point; the spherical harmonic function is composed of a plurality of basis functions;
and for each target detection point, storing a projection coefficient value corresponding to the current target detection point into a vertex color corresponding to the current target detection point and/or attribute information of the current target detection point, importing the projection coefficient value into a target position in an engine according to index coordinates of the vertex, determining target distance information between each target detection point and the second model under a target shooting angle according to data reconstructed based on the vertex color and/or the attribute information stored in the target position, determining a transparency parameter of each target detection point based on the target distance information, and displaying the first sub model and the second sub model based on each transparency parameter.
Or, the method comprises:
determining a target shooting angle between the shooting device and each target detection point on the first sub-model;
for each target detection point, retrieving a projection coefficient value corresponding to the current target detection point from the vertex color and/or attribute information corresponding to the current target detection point, and determining the distance information between the first sub-model and the second sub-model corresponding to the current target detection point according to the projection coefficient value and the target shooting angle; the first sub-model is a model wrapping the second sub-model; the projection coefficient value stored in the vertex color and/or attribute information is determined after the distance function of the spherical distribution of each target detection point on the first sub-model is processed based on the spherical harmonic function;
and determining the transparency parameter of each target detection point according to the distance information corresponding to each target detection point, and displaying the first submodel and the second submodel based on the transparency parameter.
Computer storage media for embodiments of the present invention may take the form of any combination of one or more computer-readable media. The computer readable medium may be a computer readable signal medium or a computer readable storage medium. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples (a non-exhaustive list) of the computer readable storage medium would include the following: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this document, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
A computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
Computer program code for carrying out operations for embodiments of the present invention may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, Smalltalk, C++ or the like and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer or entirely on the remote computer or server. In the latter scenario, the remote computer may be connected to the user's computer through any type of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider).
It is to be noted that the foregoing is only illustrative of the preferred embodiments of the present invention and the technical principles employed. It will be understood by those skilled in the art that the present invention is not limited to the particular embodiments described herein, but is capable of various obvious changes, rearrangements and substitutions as will now become apparent to those skilled in the art without departing from the scope of the invention. Therefore, although the present invention has been described in greater detail by the above embodiments, the present invention is not limited to the above embodiments, and may include other equivalent embodiments without departing from the spirit of the present invention, and the scope of the present invention is determined by the scope of the appended claims.

Claims (15)

1. A method of determining transparency, comprising:
respectively determining a distance function of spherical distribution of each target detection point on the first submodel, wherein the distance function is determined according to distance information between the target detection point and the second submodel in each direction; the first sub-model is a model wrapping the second sub-model;
processing each distance function based on a preset spherical harmonic function to obtain a projection coefficient value of each target detection point; the spherical harmonic function is composed of a plurality of basis functions;
and for each target detection point, storing a projection coefficient value corresponding to the current target detection point into a vertex color corresponding to the current target detection point and/or attribute information of the current target detection point, importing the projection coefficient value into a target position in an engine according to index coordinates of a vertex, determining target distance information between each target detection point and the second model under a target shooting angle according to data reconstructed based on the vertex color and/or the attribute information stored in the target position, determining transparency parameters of each target detection point based on the target distance information, and displaying the first sub model and the second sub model based on each transparency parameter.
2. The method of claim 1, wherein the separately determining a distance function for the spherical distribution of each target detection point on the first submodel comprises:
for each target detection point on the first submodel, determining information of each collision point to be processed when the current target detection point transmits physical rays to each direction and penetrates through the second submodel;
determining distance information between the current target detection point and the second submodel in each direction according to the information of the current target detection point and each collision point to be processed;
and determining the distance function of the spherical distribution of the corresponding target detection points according to the distance information of each target detection point in each direction.
3. The method of claim 2, wherein the determining, for each target detection point on the first submodel, information of each collision point to be processed when the current target detection point transmits a physical ray to each direction and passes through the second submodel comprises:
and taking the current target detection point as a sphere center, transmitting physical rays to any spatial direction, and determining the information of collision points to be processed when each physical ray penetrates through the second submodel.
4. The method according to claim 2, wherein the determining distance information between the current target detection point and the second submodel in each direction according to the information of the current target detection point and each collision point to be processed comprises:
when the physical ray and the second sub-model have a collision point to be processed, determining the distance information between the collision point to be processed and the current target detection point;
when the physical ray and the second sub-model do not have a collision point to be processed, setting distance information corresponding to the collision point to be processed as a set value;
and determining the distance information between the current target detection point and the second submodel in each direction according to the distance information or the set value corresponding to each collision point to be processed.
5. The method of claim 1, wherein the processing each distance function based on a preset spherical harmonic function to obtain a projection coefficient value of each target detection point comprises:
determining the order of the spherical harmonic function, and determining the representation mode of the basis functions in the spherical harmonic function and the number of the basis functions according to the order;
for each target detection point, processing a distance function of the current target detection point based on each basis function to obtain a projection coefficient value of the current target detection point; the number of projection coefficient values is the same as the number of basis functions.
6. The method according to claim 1, wherein the storing the projection coefficient value corresponding to the current target detection point into the vertex color corresponding to the current target detection point and/or the attribute information of the current target detection point comprises:
determining the target number of the projection coefficient values corresponding to the current target detection points;
determining the number of vertex colors corresponding to the current target detection points based on the number of targets and the number of storages corresponding to the vertex colors;
and storing the projection coefficient value into the vertex color corresponding to the current target detection point.
7. The method according to claim 1, wherein the storing the projection coefficient value corresponding to the current target detection point into the vertex color corresponding to the current target detection point and/or the attribute information of the current target detection point comprises:
determining the target number of the projection coefficient values corresponding to the current target detection points;
determining the preset number of the stored projection coefficient values according to the vertex color of the current target detection point;
and storing the residual projection coefficient value into the attribute information of the vertex color according to the target number and the preset number.
8. The method of claim 1, wherein the importing into the target position in the engine according to the index coordinates of the vertex comprises:
and importing the vertex color and/or the attribute information into a target position in an engine for storage according to the coordinate information.
9. A method of determining transparency, comprising:
determining a target shooting angle corresponding to each target detection point on the shooting device and the first submodel;
for each target detection point, a projection coefficient value corresponding to the current target detection point is called from the vertex color and/or attribute information corresponding to the current target detection point, and distance information between a first sub-model and a second sub-model corresponding to the current target detection point is determined according to the projection coefficient value and a target shooting angle; the first sub-model is a model wrapping the second sub-model; the projection coefficient value stored in the vertex color and/or attribute information is determined after the distance function of the spherical distribution of each target detection point on the first sub-model is processed on the basis of the spherical harmonic function;
and determining the transparency parameter of each target detection point according to the distance information corresponding to each target detection point, and displaying the first submodel and the second submodel based on the transparency parameter.
10. The method according to claim 9, wherein for each target detection point, retrieving a projection coefficient value corresponding to the current target detection point from vertex color and/or attribute information corresponding to the current target detection point, and determining distance information between the first sub-model and the second sub-model corresponding to the current target detection point according to the projection coefficient value and a target shooting angle comprises:
for each target detection point, determining a projection coefficient value corresponding to the current target detection point according to at least one vertex color and/or attribute information corresponding to the current target detection point;
processing the projection coefficient value based on a preset spherical harmonic function, and determining a reconstruction function of the current target detection point;
and inputting the target shooting angle corresponding to the current target detection point into the reconstruction function to obtain the distance information when the straight line of the current target detection point and the target shooting device intersects with the first sub-model and the second sub-model.
11. The method of claim 9, wherein the determining a transparency parameter of each target detection point according to the distance information corresponding to each target detection point, and displaying the first submodel and the second submodel based on the transparency parameter comprises:
determining the transparency parameter of each target detection point according to the preset corresponding relation between the distance information and the transparency parameter and the distance information corresponding to each target detection point;
displaying the first sub-model and the second sub-model based on the transparency parameter.
12. An apparatus for determining transparency, comprising:
the distance information determining module is used for respectively determining a distance function of the spherical distribution of each target detection point on the first submodel, and the distance function is determined according to the distance information between the target detection points and the second submodel in each direction; the first sub-model is a model wrapping the second sub-model;
the projection coefficient value determining module is used for processing each distance function based on a preset spherical harmonic function to obtain the projection coefficient value of each target detection point on each basis function; the spherical harmonic function is composed of a plurality of basis functions;
and the projection coefficient value storage module is used for storing a projection coefficient value corresponding to the current target detection point into a vertex color corresponding to the current target detection point and/or attribute information of the current target detection point for each target detection point, importing the target position in an engine according to the index coordinate of the vertex, determining target distance information between each target detection point and the second model under a target shooting angle by using data reconstructed based on the vertex color and/or the attribute information stored in the target position, determining transparency parameters of each target detection point based on the target distance information, and displaying the first sub model and the second sub model based on each transparency parameter.
13. An apparatus for determining transparency, comprising:
the target shooting angle determining module is used for determining a target shooting angle corresponding to each target detection point on the shooting device and the first sub-model;
the distance information determining module is used for calling a projection coefficient value corresponding to the current target detection point from the vertex color and/or the attribute information of the current target detection point aiming at each target detection point, and determining the distance information between a first sub-model and a second sub-model corresponding to the current target detection point according to the projection coefficient value and the target shooting angle; the first sub-model is a model wrapping the second sub-model; the projection coefficient value stored in the vertex color and/or attribute information is determined after the distance function of the spherical distribution of each target detection point on the first sub-model is processed on the basis of the spherical harmonic function;
and the transparent display module is used for determining the transparency parameter of each target detection point according to the distance information corresponding to each target detection point and displaying the first submodel and the second submodel based on the transparency parameter.
14. An electronic device, the electronic device comprising:
one or more processors;
a storage device to store one or more programs,
when executed by the one or more processors, cause the one or more processors to implement a method of determining transparency as recited in any one of claims 1-8 or 9-11.
15. A storage medium containing computer executable instructions for performing a method of determining transparency as claimed in any one of claims 1 to 8 or 9 to 11 when executed by a computer processor.
CN202011444046.4A 2020-12-08 2020-12-08 Method and device for determining transparency, electronic equipment and storage medium Pending CN114627231A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202011444046.4A CN114627231A (en) 2020-12-08 2020-12-08 Method and device for determining transparency, electronic equipment and storage medium
PCT/CN2021/131497 WO2022121652A1 (en) 2020-12-08 2021-11-18 Transparency determination method and apparatus, electronic device, and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011444046.4A CN114627231A (en) 2020-12-08 2020-12-08 Method and device for determining transparency, electronic equipment and storage medium

Publications (1)

Publication Number Publication Date
CN114627231A (en) 2022-06-14

Family

ID=81896197

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011444046.4A Pending CN114627231A (en) 2020-12-08 2020-12-08 Method and device for determining transparency, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN114627231A (en)


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination