WO2022121654A1 - Method and apparatus for determining transparency, and electronic device and storage medium


Info

Publication number: WO2022121654A1
Authority: WO (WIPO PCT)
Application number: PCT/CN2021/131500
Prior art keywords: target detection, detection point, point, sub, model
Other languages: English (en), Chinese (zh)
Inventor: 冯乐乐
Original Assignee: 上海米哈游天命科技有限公司
Application filed by 上海米哈游天命科技有限公司
Publication of WO2022121654A1

Classifications

    • G06T7/0002 Image analysis; inspection of images, e.g. flaw detection
    • G06T15/50 3D [Three Dimensional] image rendering; lighting effects
    • G06T7/60 Image analysis; analysis of geometric attributes
    • G06T7/90 Image analysis; determination of colour characteristics
    • G06T2207/10004 Indexing scheme for image analysis or enhancement; image acquisition modality; still image, photographic image

Definitions

  • the embodiments of the present application relate to the technical field of games, for example, to a method, apparatus, electronic device, and storage medium for determining transparency.
  • In related technologies, a translucent effect is usually set between an inner model and an outer model, for example, the translucent display of a skin model under a clothes model.
  • The translucent display mainly depends on the effect produced after the light reflected by the inner-layer object travels a certain distance, penetrates the outer-layer object, and reaches the human eye.
  • Each model is composed of individual points. The farther the distance between a point on the inner-layer object and the corresponding point on the outer-layer object, the weaker the translucent effect; conversely, the closer the distance, the stronger the translucent effect.
  • Determining the translucent effect mainly means setting the transparency display value of the outer-layer model.
  • The transparency display value is usually fixed, and all points on the outer-layer model are displayed transparently according to the set transparency display value.
  • The transparent display effect produced by this method deviates from the actual situation, so the transparent display effect is poor and the user experience is poor.
  • Embodiments of the present application provide a method, apparatus, electronic device, and storage medium for determining transparency, so that the transparent display effect is consistent with reality and user experience is improved.
  • an embodiment of the present application provides a method for determining transparency, the method comprising:
  • determining relative shooting angle information corresponding to the shooting device and each target detection point on the first sub-model; for each target detection point, determining the target reconstruction function corresponding to the current target detection point, and inputting the relative shooting angle information corresponding to the current target detection point into the target reconstruction function to obtain the distance information between the first sub-model and the second sub-model for the target detection point under the current shooting angle information; the first sub-model is a sub-model that wraps a part of the second sub-model; the reconstruction function is constructed based on the projection coefficient values of the target detection point on each basis function in the spherical harmonic function;
  • when it is detected that the first sub-model is displayed semi-transparently, determining the transparency information of the point to be collided corresponding to each target detection point according to the distance information corresponding to each target detection point; the point to be collided is the collision point corresponding to the target detection point on the second sub-model;
  • adjusting the attribute information of each point to be collided based on the transparency information of each point to be collided, so that each point to be collided on the second sub-model is displayed based on the attribute information.
  • the embodiments of the present application also provide a device for determining transparency, including:
  • a relative shooting angle information determination module configured to determine the relative shooting angle information corresponding to the shooting device and each target detection point on the first sub-model;
  • a distance information determination module configured to, for each target detection point, determine the target reconstruction function corresponding to the current target detection point, and input the relative shooting angle information corresponding to the current target detection point into the target reconstruction function to obtain the distance information between the first sub-model and the second sub-model for the target detection point under the current shooting angle information;
  • the first sub-model is a sub-model that wraps a part of the second sub-model; the reconstruction function is constructed based on the projection coefficient values of the target detection point on each basis function in the spherical harmonic function;
  • a transparency information determination module configured to determine, when it is detected that the first sub-model is displayed semi-transparently, the transparency information of the point to be collided corresponding to each target detection point according to the distance information corresponding to each target detection point; the point to be collided is the collision point corresponding to the target detection point on the second sub-model;
  • a transparent display module configured to adjust the attribute information of each point to be collided based on the transparency information of each point to be collided, so that each point to be collided on the second sub-model is displayed based on the attribute information.
  • an embodiment of the present application also provides an electronic device, the electronic device comprising:
  • one or more processors;
  • storage means arranged to store one or more programs;
  • when the one or more programs are executed by the one or more processors, the one or more processors implement the method for determining transparency described in any one of the embodiments of the present application.
  • The embodiments of the present application further provide a storage medium containing computer-executable instructions which, when executed by a computer processor, are configured to perform the method for determining transparency described in any one of the embodiments of the present application.
  • FIG. 1 is a schematic flowchart of a method for determining transparency according to an embodiment of the present application
  • FIG. 2 is a schematic flowchart of a method for determining transparency provided by another embodiment of the present application.
  • FIG. 3 is a schematic flowchart of a method for determining transparency provided by another embodiment of the present application.
  • FIG. 4 is a schematic flowchart of a method for determining transparency provided by another embodiment of the present application.
  • FIG. 5 is a schematic structural diagram of a device for determining transparency according to an embodiment of the present application.
  • FIG. 6 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
  • FIG. 1 is a schematic flowchart of a method for determining transparency provided by an embodiment of the present application.
  • This embodiment can be applied to the case of determining, according to pre-stored projection coefficient values, the target reconstruction function corresponding to each target detection point, determining, according to the target reconstruction function of each target detection point, the distance information between the first sub-model and the second sub-model corresponding to the target detection point at the corresponding viewing angle, and then performing transparent display based on the distance information.
  • the method may be performed by an apparatus for determining transparency, the apparatus may be implemented in the form of software and/or hardware, and the hardware may be an electronic device, for example, the electronic device may be a mobile terminal or the like.
  • The first sub-model and the second sub-model are relative to each other.
  • For example, the sub-models can be a skin model and a clothes model; the model corresponding to the clothes can be used as the first sub-model, and the model corresponding to the skin can be used as the second sub-model.
  • the target detection point may be a preset detection point on the first sub-model; it may also be that the preset point on the first sub-model is divided into a plurality of blocks, and the center point of each block may be used as the target detection point; or It can be a detection point set by the developer according to actual needs; of course, it can also be that the first sub-model is composed of multiple points, and each point can be used as a target detection point.
  • the photographing device is a device for observing and photographing the first sub-model
  • the relative photographing angle is the relative angle between the photographing device and each target detection point on the first sub-model.
  • The relative shooting angle information is determined according to the position information between the shooting device and each target detection point.
  • According to the position information of the shooting device and each target detection point, the relative angle information between the shooting device and each target detection point can be determined, and this angle information can be used as the relative shooting angle information.
  • the first sub-model is a sub-model that wraps the part of the second sub-model; the reconstruction function is constructed based on the projection coefficient value of the target detection point on each basis function in the spherical harmonic function.
  • Each target detection point has a corresponding target reconstruction function.
  • The reconstruction of the reconstruction function of one of the target detection points is taken as an example for introduction.
  • The reconstruction function of a target detection point is a function constructed by processing the projection coefficients of the current target detection point according to the spherical harmonic function, so as to recover the distance values obtained when the current target detection point emits physical rays in each direction in space and the physical rays collide with the second sub-model.
  • The reconstruction function can be used to process the input relative shooting angle to determine the distance information between the first sub-model and the second sub-model corresponding to the target detection point under that relative shooting angle.
  • According to the projection coefficient values of the current target detection point on each basis function in the preset spherical harmonic function, the distance information between the collision point and the current target detection point, obtained when the current target detection point emits physical rays in each direction in space and the physical rays collide with the second sub-model, can be recovered. According to the distance information of each direction in space corresponding to the current target detection point, a reconstruction function corresponding to the current target detection point is constructed.
  • the target shooting angle can be input into the reconstruction function corresponding to the current target detection point.
  • The reconstruction function can process the target shooting angle and output the distance information between the current target detection point and the collision point at which the line to which the current target detection point and the target photographing device belong collides with the second sub-model under that angle.
  • For example, the relative shooting angle can be input into the reconstruction function corresponding to target detection point A, and the distance information between the first sub-model and the second sub-model corresponding to target detection point A at this angle can be obtained; for example, the obtained distance information is 5 mm.
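  • As an illustrative aside (not part of the application text), a minimal Python sketch of such a reconstruction function is shown below; it assumes what the application calls a second-order spherical harmonic (four basis functions), and the stored coefficient values are invented for the example.

```python
import math

# First four real spherical-harmonic basis functions (bands 0 and 1).
Y00 = 0.5 * math.sqrt(1.0 / math.pi)      # ~0.282095
Y1 = math.sqrt(3.0 / (4.0 * math.pi))     # ~0.488603

def sh_basis_order2(direction):
    """Evaluate the four basis functions for a unit direction vector (x, y, z)."""
    x, y, z = direction
    return [Y00, Y1 * y, Y1 * z, Y1 * x]

def reconstruct_distance(coeffs, view_direction):
    """Target reconstruction function: the stored projection coefficient values
    weight the basis functions, giving the outer-to-inner distance along the
    input relative shooting angle."""
    return sum(c * b for c, b in zip(coeffs, sh_basis_order2(view_direction)))

# Hypothetical stored coefficients for a detection point (roughly a 5-unit base distance).
coeffs_point_a = [17.7, 0.4, -1.2, 0.9]
print(reconstruct_distance(coeffs_point_a, (0.0, 0.0, 1.0)))   # distance toward +z
```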
  • The translucent display can be understood as meaning that the material used for the first sub-model is a translucent material.
  • the distance information corresponding to each target detection point may be the same or different.
  • the distance information corresponding to each target detection point refers to the distance between the target detection point and the point to be collided when the line to which the photographing device and the target detection point belong collide with the second sub-model.
  • Transparency information is used to indicate the degree of transparency when the model is displayed, which can be expressed as a percentage, for example: the transparency is 80% and so on.
  • the same method is used to determine the transparency parameters of different target detection points.
  • The determination of the transparency parameter of one of the target detection points is taken as an example for introduction.
  • Determining the transparency parameter of the current target detection point may be: determining the transparency parameter according to the distance information from a pre-stored correspondence between distance information and transparency parameters; or calculating it according to a preset transparency parameter calculation model, where the distance information is input into the transparency parameter calculation model and the transparency parameter corresponding to the distance information is obtained through calculation.
  • That is, the transparency parameter can be regarded as a function f of the distance information, where f is a monotonically decreasing function.
  • The target detection points on the first sub-model can be displayed according to the corresponding transparency parameters, and the intersections of the lines to which the target detection points and the target photographing device belong with the second sub-model are displayed according to the corresponding transparency parameters, so as to obtain the effect of transparent display.
  • the transparency parameter of each target detection point is determined, and the effect of transparently displaying the first sub-model and the second sub-model can be realized.
  • The transparency parameter corresponding to the distance information may be determined from the pre-stored correspondence between each distance information and the transparency parameter, which may include the transparency parameter of the first sub-model and the transparency parameter of the second sub-model for subsequent transparent display.
  • The attribute information of the points to be collided corresponding to each target detection point on the first sub-model can then be adjusted based on the above-mentioned transparency parameters.
  • In an embodiment, determining the transparency information of the point to be collided corresponding to each target detection point according to the distance information corresponding to each target detection point includes: when it is detected that the first sub-model is displayed semi-transparently, determining the transparency information of the point to be collided corresponding to each target detection point according to the distance information corresponding to each target detection point and according to the correspondence between pre-established distance information and transparency information.
  • the corresponding relationship between the distance information and the transparency parameter may be established in advance, and the transparency parameter of each target detection point may be determined according to the corresponding relationship. For example, when the distance information is 5mm, the transparency parameter is determined to be 0.5 according to the corresponding relationship; when the distance information is 10mm, the transparency parameter is determined to be 0.2 according to the corresponding relationship.
  • the transparency information corresponding to each target detection point can be determined according to the corresponding relationship between the distance information and the transparency parameter.
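  • As an illustrative aside (not part of the application text), the following sketch shows one simple choice for the monotonically decreasing mapping from distance to transparency; the endpoint values are hypothetical and were picked only so that the outputs match the 5 mm and 10 mm example above.

```python
def distance_to_transparency(distance_mm, near_mm=0.0, far_mm=10.0,
                             alpha_near=0.8, alpha_far=0.2):
    """Monotonically decreasing mapping from the outer-to-inner distance to a
    transparency parameter: the farther apart the two sub-models are, the
    weaker the translucent effect. The endpoints are assumed values; the
    application only requires that the correspondence be fixed in advance."""
    t = min(max((distance_mm - near_mm) / (far_mm - near_mm), 0.0), 1.0)
    return alpha_near + (alpha_far - alpha_near) * t

print(distance_to_transparency(5.0))    # 0.5, matching the 5 mm example above
print(distance_to_transparency(10.0))   # 0.2, matching the 10 mm example above
```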
  • the point to be collided is a point on the second sub-model.
  • The point to be collided can be obtained when the line to which the photographing device and the target detection point belong collides with the second sub-model; that is, the point to be collided is the point on the second sub-model corresponding to each target detection point under the relative shooting angle of that target detection point.
  • the attribute information can be the color attribute of the point to be collided.
  • The color attribute of the point to be collided on the second sub-model corresponding to the target detection point can be adjusted according to the transparency information. For example, the greater the distance information, the darker the color of the point to be collided, and vice versa. In this way, when the outer layer is displayed semi-transparently, the color depth of the point to be collided corresponding to each target detection point can be adjusted, thereby realizing the transparent display.
  • In the technical solution of this embodiment, the distance information between the first sub-model and the second sub-model corresponding to the target detection point is obtained by determining the target shooting angle and retrieving the projection coefficient values from the vertex color and/or attribute information, and the transparency parameter is then determined and displayed accordingly. This solves the technical problem of a poor transparent display effect and poor user experience caused by the deviation between the transparent display and the actual situation when the first sub-model is transparently displayed with a fixed transparency display value: the transparency parameter is adjusted according to the actual situation, so that the transparent display effect is consistent with the theoretical effect and the user experience is improved.
  • In an embodiment, adjusting the attribute information of each point to be collided based on the transparency information of each point to be collided, so that each point to be collided on the second sub-model is displayed based on the attribute information, includes: adjusting the color depth value of each point to be collided according to the transparency information of each point to be collided, so as to perform transparent display based on the color depth value of each point to be collided on the second sub-model.
  • the attribute information may be a color depth value.
  • The color depth of the point to be collided corresponding to each target detection point can be adjusted according to the transparency parameter corresponding to each target detection point. For example, the closer the distance, the larger the color depth value and the lighter the color of the point to be collided.
  • the color attributes corresponding to different transparency parameters can be set according to the actual situation. After the transparency parameter is determined, the color depth value of each point to be collided can be adjusted according to the color depth value corresponding to the transparency parameter.
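  • As an illustrative aside (not part of the application text), the following sketch shows one possible reading of adjusting the color depth value of a point to be collided; the base color and the scaling rule are assumptions, not specified by the application.

```python
def adjust_colour_depth(base_rgb, transparency):
    """One possible reading of 'adjusting the color depth value': scale the
    colour of the point to be collided by the transparency parameter, so a
    larger distance (smaller transparency) yields a darker colour and a
    smaller distance yields a lighter colour."""
    return tuple(channel * transparency for channel in base_rgb)

inner_point_rgb = (0.87, 0.68, 0.58)   # hypothetical base colour of the inner-model point
print(adjust_colour_depth(inner_point_rgb, 0.5))
print(adjust_colour_depth(inner_point_rgb, 0.2))   # farther apart, darker result
```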
  • In addition, the attribute information of the first sub-model and the second sub-model can be obtained, and by combining the attribute information with the distance information, the transparency information corresponding to each target detection point can be determined comprehensively, so that the first sub-model and the second sub-model are transparently displayed.
  • FIG. 2 is a schematic flowchart of a method for determining transparency provided by another embodiment of the present application.
  • On the basis of the foregoing embodiment, a reconstruction function can be reconstructed according to the spherical harmonic function and the projection coefficient values, so that the distance information of each target detection point can be determined based on the reconstruction function.
  • For the specific implementation, reference may be made to the following description. Explanations of terms that are the same as or corresponding to those in the above embodiments are not repeated here.
  • the method for determining transparency includes:
  • S210: Determine a target reconstruction function corresponding to each target detection point according to the pre-stored projection coefficient values of each target detection point on each basis function in the spherical harmonic function.
  • the target reconstruction function is used to process the input current shooting angle to obtain distance information between the first sub-model and the second sub-model to which the target detection point corresponding to the current shooting angle belongs.
  • Each target detection point has its corresponding target reconstruction function.
  • Taking each target detection point as the origin, physical rays can be emitted in each direction in space, and the distance information between the target detection point and the second sub-model along each physical ray can be determined.
  • The distance information is processed, and all distances can be compressed into several function values, which can be used as projection coefficient values.
  • According to the projection coefficient values, the spherical function corresponding to the target detection point can be reconstructed, so that the relative shooting angle can be input into this spherical function, that is, the reconstruction function, and the distance information under that angle can be obtained.
  • the pre-stored projection coefficient value corresponding to each target detection point can be obtained, and the projection coefficient value can be processed based on the spherical harmonic function, and the target reconstruction function corresponding to each target detection point can be reconstructed, and the target reconstruction function can be A function of determining the distance information between the first sub-model and the second sub-model at each relative shooting angle.
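  • As an illustrative aside (not part of the application text), the following sketch shows how per-direction distance samples could be compressed into four projection coefficient values by projecting onto spherical-harmonic basis functions; the Monte Carlo sampling scheme and the sample count are assumptions.

```python
import math
import random

# First four real spherical-harmonic basis functions (bands 0 and 1).
Y00 = 0.5 * math.sqrt(1.0 / math.pi)
Y1 = math.sqrt(3.0 / (4.0 * math.pi))

def sh_basis_order2(x, y, z):
    return [Y00, Y1 * y, Y1 * z, Y1 * x]

def project_distances(directions, distances):
    """Compress per-direction distance samples into four projection coefficient
    values, c_k ~ (4*pi/n) * sum_i dist_i * Y_k(dir_i), assuming the sample
    directions are distributed uniformly over the sphere."""
    coeffs = [0.0, 0.0, 0.0, 0.0]
    for (x, y, z), dist in zip(directions, distances):
        for k, basis_value in enumerate(sh_basis_order2(x, y, z)):
            coeffs[k] += dist * basis_value
    return [c * 4.0 * math.pi / len(directions) for c in coeffs]

# Sanity check with a constant distance field: only the first coefficient is
# significant, and reconstructing it returns roughly the original distance.
dirs = []
for _ in range(4096):
    x, y, z = (random.gauss(0.0, 1.0) for _ in range(3))   # uniform directions via normalised Gaussians
    norm = math.sqrt(x * x + y * y + z * z) or 1.0
    dirs.append((x / norm, y / norm, z / norm))
coeffs = project_distances(dirs, [5.0] * len(dirs))
print(coeffs)
print(coeffs[0] * Y00)   # close to 5.0
```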
  • S220: Determine the relative shooting angle information corresponding to the shooting device and each target detection point on the first sub-model.
  • For each target detection point, determine the target reconstruction function corresponding to the current target detection point, and input the relative shooting angle information corresponding to the current target detection point into the target reconstruction function to obtain the distance information between the first sub-model and the second sub-model for the target detection point under the current shooting angle information.
  • In the technical solution of this embodiment, the projection coefficient values corresponding to each target detection point can be reconstructed according to the spherical harmonic function to obtain the reconstruction function of each target detection point, and the relative shooting angle corresponding to each target detection point is input into that reconstruction function to obtain the distance information between the first sub-model and the second sub-model corresponding to each target detection point. The transparency parameter information can be further determined in combination with the distance information, and the color attribute of each point to be collided on the second sub-model is then dynamically adjusted according to the distance information and the color attribute corresponding to each target detection point, so that the model is transparently displayed.
  • FIG. 3 is a schematic flowchart of a method for determining transparency according to another embodiment of the present application.
  • On the basis of the foregoing embodiments, the method further includes: determining the projection coefficient value of each target detection point.
  • the method includes:
  • Determining the projection coefficient values of each target detection point may be: for each target detection point on the first sub-model, determining the information of the collision points to be processed when the current target detection point emits physical rays in all directions and the physical rays pass through the second sub-model.
  • The distance values of the current target detection point in each direction are determined according to the current target detection point and the information of each collision point to be processed.
  • According to the distance values of each target detection point in each direction, a distance function corresponding to each target detection point is determined.
  • the distance function of each target detection point is processed based on the spherical harmonic function, and the projection coefficient value of each target detection point on each basis function in the spherical harmonic function is obtained; the spherical harmonic function includes a plurality of basis functions.
  • the first sub-model and the second sub-model are relative. If the application scene is a skin model and a clothing model, the model corresponding to the clothes can be used as the first sub-model, and the model corresponding to the skin can be used as the second sub-model.
  • the target detection point may be a preset detection point on the first sub-model; it may also be that the preset point on the first sub-model is divided into a plurality of blocks, and the center point of each block may be used as the target detection point; or It can be a detection point set by the developer according to actual needs; of course, it can also be that the first sub-model is composed of multiple points, and each point can be used as a target detection point.
  • the distance function corresponds to each target detection point, that is, a target detection point corresponds to a distance function.
  • Each distance function can be understood as being determined according to the relative distance information between a certain target detection point and the second sub-model in each direction in space.
  • Determining the distance between each target detection point and the second sub-model in each direction can be: taking the target detection point as the center of a sphere, emitting physical rays in each direction in space, and determining the distance information between the intersection of each physical ray with the second sub-model and the target detection point.
  • That is, each distance function can be determined based on the distances between a certain target detection point and the second sub-model in each direction.
  • the same method is used to determine the spherical distribution distance function of different target detection points.
  • The determination of the spherical distribution distance function of one of the target detection points is taken as an example for introduction.
  • A target detection point on the first sub-model can be used as the center of a sphere, physical rays can be emitted in every direction in space, and the distance information between the intersection of each physical ray with the second sub-model and the target detection point can be determined. If the physical ray has an intersection with the second sub-model in the direction of the ray, the distance between the intersection and the target detection point is used as the distance information between the target detection point and the second sub-model in that direction. If the physical ray does not have an intersection with the second sub-model in the direction of the ray, a preset maximum distance can be used as the distance information between the target detection point and the second sub-model in that direction.
  • the distance function of the spherical distribution of the target detection point can be determined.
  • For example, the spherical distribution distance function of target detection point A can be written as F(i) = dist_i, i = 1, 2, ..., n, where i represents the i-th direction, F(i) represents the distance information in the i-th direction, dist_i represents the specific distance value in that direction, and n represents the total number of directions.
  • the distance function of the spherical distribution of each target detection point is a composite function
  • the number of sub-functions in the composite function can be determined according to the preset number of samples.
  • The default can be 16 × 32 precision, that is, the composite function contains 512 sub-functions.
  • the specific sampling quantity can be determined according to actual needs.
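  • As an illustrative aside (not part of the application text), the following sketch builds the 16 × 32 sample directions and the corresponding spherically distributed distance function; the `cast_ray` intersection query and the maximum-distance constant are hypothetical stand-ins for engine-side functionality.

```python
import math

MAX_DISTANCE = 100.0   # hypothetical preset maximum for rays that never hit the second sub-model

def sample_directions(n_theta=16, n_phi=32):
    """The 16 x 32 = 512 sample directions of the spherically distributed
    distance function (one sub-function per direction)."""
    directions = []
    for i in range(n_theta):
        theta = math.pi * (i + 0.5) / n_theta            # polar angle
        for j in range(n_phi):
            phi = 2.0 * math.pi * (j + 0.5) / n_phi      # azimuthal angle
            directions.append((math.sin(theta) * math.cos(phi),
                               math.sin(theta) * math.sin(phi),
                               math.cos(theta)))
    return directions

def spherical_distance_function(detection_point, cast_ray, directions):
    """F(i) = dist_i: distance from the detection point to the second sub-model
    along the i-th direction, or the preset maximum when the ray misses.
    `cast_ray(origin, direction)` stands in for an engine-side intersection
    query and returns a hit distance or None."""
    distances = []
    for d in directions:
        hit = cast_ray(detection_point, d)
        distances.append(hit if hit is not None else MAX_DISTANCE)
    return distances

# Usage with a trivial stand-in query: the inner model is a sphere of radius 3
# centred on the detection point, so every ray hits at distance 3.
dirs = sample_directions()
dists = spherical_distance_function((0.0, 0.0, 0.0), lambda origin, direction: 3.0, dirs)
print(len(dists), dists[0])   # 512 3.0
```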
  • the same method is used to determine the projection coefficient values of different target detection points.
  • the projection coefficient value of one of the target detection points is determined as an example.
  • the spherical harmonic function is composed of multiple basis functions, and each distance function can be processed by using the basis function. Different orders of spherical harmonics correspond to different numbers of basis functions.
  • the second-order spherical harmonics include 4 basis functions, the third-order spherical harmonics include 9 basis functions, and the fourth-order spherical harmonics include 16 basis functions.
  • the projection coefficient value is the coefficient value obtained by processing the distance function according to the basis function in the spherical harmonic function, and the number of coefficient values is the same as that of the basis function. According to the spherical harmonic function and its corresponding multiple basis functions, the spherical distribution distance function of the target detection point can be compressed to obtain the projection coefficient value.
  • If the spherical harmonic function is second-order, that is, it includes 4 basis functions, then inputting the distance function corresponding to the target detection point into the spherical harmonic function yields 4 projection coefficient values, so that the distance function is compressed into 4 projection coefficient values.
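  • As an illustrative aside (not part of the application text), in the usual spherical-harmonic notation the compression and reconstruction described above can be written as follows, where an a-th-order expansion uses a^2 basis functions:

```latex
% Projection (compression) of the distance function F onto the basis functions Y_{lm},
% and reconstruction of the approximate distance \tilde{F} from the coefficients:
c_{lm} = \int_{S^2} F(\omega)\, Y_{lm}(\omega)\, \mathrm{d}\omega ,
\qquad
\tilde{F}(\omega) = \sum_{l=0}^{a-1} \sum_{m=-l}^{l} c_{lm}\, Y_{lm}(\omega) ,
\qquad \text{with } a^2 \text{ coefficients in total.}
```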
  • the higher the order of the spherical harmonic function the higher the similarity between the reconstructed sphere and the actual sphere during reconstruction, so developers can choose spherical harmonic functions of different orders according to actual needs.
  • the higher-order spherical harmonic function contains more basis functions, and the degree of restoration of the distance function is higher in the subsequent restoration of the distance function according to the spherical harmonic function and the projection coefficient value.
  • The spherically distributed distance function of the target detection point can be input into the spherical harmonic function, and the distance function is processed based on the basis functions in the spherical harmonic function to obtain the projection coefficient values of the distance function on each basis function. It follows that the number of projection coefficient values is equal to the number of basis functions in the spherical harmonic function.
  • the distance functions of the spherical distribution of different target detection points can be the same or different.
  • the distance functions corresponding to different target detection points are the same.
  • In general, the distance functions corresponding to different target detection points are different; whether the distance functions are the same is determined based on the distance information of the target detection points in each direction in space. After inputting different distance functions into the preset spherical harmonic function for processing, the obtained projection coefficient values are different.
  • On the basis of the foregoing embodiments, the method further includes: storing the projection coefficient values in at least one image, so that the projection coefficient values corresponding to the target detection points can be obtained from the at least one image in order to reconstruct the corresponding target reconstruction functions.
  • The projection coefficient values of each target detection point are correspondingly stored in at least one image according to the spatial position relationship, and the at least one image is stored at the target position, so that the projection coefficient values of all target detection points on the corresponding spherical harmonic basis functions can be obtained from the target position.
  • the method of storing the projection coefficient values of different target detection points is the same, and it can be introduced by taking the storage of the projection coefficient value corresponding to a certain target detection point as an example.
  • the number of basis functions can be four, and after the distance function is processed by the basis functions in the spherical harmonic function, four projection coefficient values can be obtained.
  • the projection coefficient values can be stored in the image according to the spatial coordinate position information. For example, if there are four projection coefficient values, four images can be used to store the projection coefficient values, and each pixel in each image stores a projection coefficient value. At least one image may be stored in the engine, and the location stored in the engine may be used as the target location.
  • the advantage of storing the image to the target location is that the stored projection coefficient values can be obtained from the target location so that a reconstruction function can be constructed from the projection coefficient values.
  • the number of images may be determined according to the number of projection coefficient values, and the projection coefficient values may be stored in corresponding pixel points in the image according to the spatial coordinate information.
  • each target detection point has 9 projection coefficient values.
  • nine images can be determined, and the nine projection coefficient values of each target detection point can be stored in the corresponding pixel points in each image according to the spatial coordinate information.
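  • As an illustrative aside (not part of the application text), the following sketch packs the projection coefficient values of each detection point into one image per coefficient; using the point's UV position as the pixel index is an assumed layout, not something the application specifies.

```python
def pack_coefficients_into_images(points, width, height):
    """Store the k-th projection coefficient of every detection point in the
    k-th image, at the pixel given by the point's (u, v) position. Each
    'image' here is just a 2-D list of floats; a real engine would use
    textures. `points` is a list of dicts with 'uv' and 'coeffs' keys
    (hypothetical layout)."""
    n_coeffs = len(points[0]["coeffs"])
    images = [[[0.0] * width for _ in range(height)] for _ in range(n_coeffs)]
    for p in points:
        u, v = p["uv"]
        px = min(int(u * width), width - 1)
        py = min(int(v * height), height - 1)
        for k, c in enumerate(p["coeffs"]):
            images[k][py][px] = c
    return images

# Example: two detection points, four coefficients each, packed into four 8x8 images.
pts = [
    {"uv": (0.25, 0.25), "coeffs": [0.012, 0.001, -0.003, 0.002]},
    {"uv": (0.75, 0.50), "coeffs": [0.020, -0.002, 0.004, 0.000]},
]
imgs = pack_coefficients_into_images(pts, 8, 8)
print(len(imgs), imgs[0][2][2])   # 4 images; first point's first coefficient at its pixel
```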
  • the projection coefficient value can also be stored in the vertex color.
  • Each target detection point on the first sub-model can correspond to a vertex color, and each vertex color has four pixel channels, that is, the R, G, B, and A channels; a projection coefficient value of the target detection point can be stored in each channel.
  • The attribute information can be scalable information corresponding to each target detection point, for example UV, i.e., the u and v texture map coordinates.
  • the vertex color corresponding to the target detection point and the attribute information of the current target detection point can be used together to store the projection coefficient value.
  • When the projection coefficient values corresponding to the current target detection point are stored in the vertex color corresponding to the current target detection point, four projection coefficient values can be stored in the four channels R, G, B, and A respectively, and the number of vertex colors required can be determined according to the number of projection coefficient values. It is also possible to store four projection coefficient values in one vertex color and store the remaining projection coefficient values in the UV coordinates corresponding to the current target detection point. The above method can reduce the use of vertex colors and facilitate the storage and subsequent retrieval and use of the projection coefficient values.
  • The vertex color corresponding to the target detection point and the UV coordinates corresponding to the current target detection point can also be used together to store the projection coefficient values. For example, if 9 projection coefficient values are to be stored, two vertex colors can be used to store 8 projection coefficient values, and the remaining 1 projection coefficient value can be stored in the UV coordinates corresponding to the current target detection point.
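  • As an illustrative aside (not part of the application text), the following sketch shows one possible packing of nine projection coefficient values into two RGBA vertex colors plus a spare UV channel; the exact layout and value encoding are assumptions.

```python
def pack_coefficients_into_vertex_data(coeffs):
    """Pack 9 projection coefficient values per vertex: 8 into two RGBA vertex
    colours and the last one into a spare UV channel. Returns
    (colour0, colour1, extra_uv); this layout is a hypothetical choice."""
    assert len(coeffs) == 9
    colour0 = tuple(coeffs[0:4])     # R, G, B, A of the first vertex colour
    colour1 = tuple(coeffs[4:8])     # R, G, B, A of the second vertex colour
    extra_uv = (coeffs[8], 0.0)      # remaining coefficient in a spare UV set
    return colour0, colour1, extra_uv

print(pack_coefficients_into_vertex_data([0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9]))
```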
  • After the projection coefficient values are stored, they can be imported into the target position in the engine according to the index coordinates of the current target detection point, so that the projection coefficient values corresponding to the current target detection point can be retrieved from the engine according to the target detection point. The transparency parameter of each target detection point can then be determined from the distance information reconstructed with the projection coefficient values, and the first sub-model and the second sub-model can be displayed based on each transparency parameter.
  • For each target detection point, retrieve the at least one image from the target position, determine the target pixel position of the current target detection point in the at least one image, and obtain the target reconstruction function corresponding to the current target detection point according to the retrieved projection coefficient values.
  • When each target detection point on the first sub-model needs to be displayed transparently, the distance information of each target detection point at the relative shooting angle needs to be determined, and the transparency parameter is then determined according to the distance information.
  • the reconstruction function corresponding to each target detection point can be reconstructed according to the projection coefficient value stored at the target position, so that the relative shooting angle can be input into the corresponding reconstruction function , to obtain the distance information corresponding to each target detection point.
  • The spatial coordinate information of the current target detection point can be obtained, the target pixel position of the current target detection point in each image can be determined according to the spatial coordinate information, and the stored projection coefficient values can be read from those pixel positions. The stored projection coefficient values are reconstructed according to the spherical harmonic function to obtain, for each physical ray emitted by the current target detection point in each direction in space, the distance information between the current target detection point and the point to be collided when that physical ray collides with the second sub-model; the distance function obtained from these distance values is the target reconstruction function corresponding to the target detection point.
  • In the technical solution of this embodiment, the projection coefficient values corresponding to each target detection point are pre-determined and stored in a preset number of maps, so that when transparent display is performed, the stored projection coefficient values can be retrieved from the corresponding positions and a reconstruction function can be reconstructed based on them, and the distance information of each target detection point under different relative shooting angles can then be determined based on the reconstruction function.
  • FIG. 4 is a schematic flowchart of a method for determining transparency according to another embodiment of the present application.
  • On the basis of the foregoing embodiments, for the specific method of determining the distance function, reference may be made to the technical solution of this embodiment. Explanations of terms that are the same as or corresponding to those in the above embodiments are not repeated here.
  • the method includes:
  • The information of each collision point to be processed may include the position information of each collision point, such as spatial coordinate information and the like. For example, take the current target detection point as the center of a sphere, emit physical rays in any direction in space, record the collision points to be processed when each physical ray intersects the second sub-model, and determine the distance value between each collision point to be processed and the target detection point.
  • physical rays may be emitted in each direction based on each target detection point on the first sub-model, and the above-mentioned physical rays may pass through the second sub-model.
  • the collision point information to be processed may be information used to describe the position of the point, such as spatial coordinate information. Therefore, according to the collision point to be processed, it can be determined that the spatial coordinate information corresponding to the collision point to be processed is the information of the collision point to be processed.
  • determining the information of the collision point to be processed may be: taking the current target detection point as the center of the sphere, emitting physical rays in any direction in space, and determining the information of the collision point to be processed when each physical ray passes through the second sub-model .
  • the emission of physical rays in each direction from each target detection point on the first sub-model can be regarded as taking the current target detection point as the center of the sphere, and emitting physical rays in each direction of the spherical surface. If the physical ray passes through the second sub-model, the spatial coordinate information of the intersection of the physical ray and the second sub-model is used as the collision point information to be processed.
  • S420: Determine the distance information between the current target detection point and the second sub-model in all directions according to the current target detection point and the information of each collision point to be processed.
  • After the information of each collision point to be processed is obtained, the distance information between each collision point to be processed and the current target detection point is determined.
  • The formula for the distance between two points in space can be used to calculate the distance between the spatial coordinate information of the collision point to be processed and the spatial coordinate information of the current target detection point.
  • If there is no collision point between a physical ray and the second sub-model, the distance information corresponding to that physical ray is set to a set value.
  • The set value may be the preset maximum distance information between a collision point to be processed and the current target detection point.
  • the distance information between the current target detection point and the second sub-model in each direction is determined.
  • The distance information or set value corresponding to each physical ray emitted by the current target detection point can be determined, and this distance information or set value can be used as the distance information between the current target detection point and the second sub-model in each direction.
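  • As an illustrative aside (not part of the application text), the following sketch shows the per-ray distance computation with the set-value fallback for rays that never hit the second sub-model; the set value itself is a hypothetical constant.

```python
import math

SET_VALUE = 100.0   # hypothetical maximum used when a ray never hits the second sub-model

def distance_or_set_value(detection_point, hit_point):
    """Distance between the current target detection point and the collision point
    to be processed; when the ray produced no collision point, the set value is used."""
    if hit_point is None:
        return SET_VALUE
    return math.dist(detection_point, hit_point)

print(distance_or_set_value((0, 0, 0), (3, 4, 0)))   # 5.0
print(distance_or_set_value((0, 0, 0), None))        # falls back to the set value
```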
  • the distance function of the spherical distribution of the target detection point can be obtained.
  • the number of sub-functions in this distance function is the same as the amount of distance information of the target detection point in each direction of the sphere.
  • the number of sub-functions can be increased, that is, the density of physical rays can be increased, and the specific number of physical rays can be determined according to actual requirements.
  • the number of basis functions contained in spherical harmonics of different orders is different.
  • the second-order spherical harmonic function contains 4 basis functions
  • the third-order spherical harmonic function contains 9 basis functions
  • the fourth-order spherical harmonic function contains 16 basis functions, and so on.
  • the higher the order of the spherical harmonic function the better the effect of subsequent reconstruction using the reconstruction function.
  • the specific order needs to be set according to actual needs.
  • If the order of the spherical harmonic function is determined to be a according to the requirements, then the number of basis functions in the spherical harmonic function can be determined to be a^2.
  • the representation of each basis function can be determined according to the relationship between the distance function and the value of the projection coefficient.
  • the same method is used to determine the projection coefficient values of different target detection points.
  • the projection coefficient value of one of the target detection points is determined as an example.
  • the projection coefficient value is a value determined by calculating the distance function using each basis function of the preset spherical harmonic functions.
  • the number of projection coefficient values is the same as the number of basis functions.
  • the distance functions of the spherical distribution of each target detection point are different, and after inputting different distance functions into the preset spherical harmonic function for processing, the obtained projection coefficient values are different.
  • S460: Correspondingly store the projection coefficient values of each target detection point in at least one image according to the spatial position relationship, and store the at least one image at the target position, so as to obtain from the target position the projection coefficient values of all target detection points on the basis functions of the corresponding-order spherical harmonic function.
  • The target position may be a storage location in the engine.
  • the engine can be the core component of a programmed editable computer game system or some interactive real-time graphics application.
  • the target location may be a storage space used to store data and/or information in the engine, and in this embodiment, is a storage space used to store coordinate information of the target detection point.
  • multiple images corresponding to the number of coefficient values may be acquired.
  • According to the spatial coordinate information of the current target detection point, the storage position of the spatial coordinates in the image is determined, and the projection coefficient values of the current target detection point are stored at the corresponding storage position in the image, so that the stored projection coefficient values can be obtained from the image at the target position to reconstruct the reconstruction function. That is, if the projection coefficient values corresponding to a certain target detection point need to be used, the projection coefficient values corresponding to the spatial coordinate information can be retrieved from the target position in the engine according to the spatial coordinate information of that target detection point, for use by the subsequent reconstruction function.
  • S480: Determine the target reconstruction function corresponding to the current target detection point, and input the relative shooting angle information corresponding to the current target detection point into the target reconstruction function, so as to obtain the distance information between the first sub-model and the second sub-model for the target detection point under the current shooting angle information.
  • The advantage of compressing the distance function based on the spherical harmonic function to obtain the projection coefficient values is that the distance function corresponding to each target detection point may contain one or more sub-functions; in this case, the amount of data to be stored is relatively large, and in order to reduce the occupancy of the storage space, the distance function can be compressed into several projection coefficient values.
  • In the technical solution of this embodiment, the distance function is processed based on the preset spherical harmonic function, the projection coefficients are determined, and the projection coefficients are stored in the vertex color and/or the attribute information; the transparency parameter at the target shooting angle is then determined and displayed according to the stored information and the reconstruction function, which solves the problem of a poor transparent display effect caused by the deviation between the transparent display and the actual situation when the first sub-model is transparently displayed with a fixed transparency display value.
  • FIG. 5 is a schematic structural diagram of an apparatus for determining transparency according to an embodiment of the present application.
  • the apparatus includes: a relative shooting angle information determination module 510 , a distance information determination module 520 , a transparency information determination module 530 and a transparent display module 540 .
  • The relative shooting angle information determination module 510 is configured to determine the relative shooting angle information corresponding to the shooting device and each target detection point on the first sub-model.
  • The distance information determination module 520 is configured to determine, for each target detection point, the target reconstruction function corresponding to the current target detection point, and to input the relative shooting angle information corresponding to the current target detection point into the target reconstruction function to obtain the distance information between the first sub-model and the second sub-model for the target detection point under the current shooting angle information.
  • The transparency information determination module 530 is configured to determine, when it is detected that the first sub-model is displayed semi-transparently, the transparency information of the point to be collided corresponding to each target detection point according to the distance information corresponding to each target detection point; the point to be collided is the collision point corresponding to the target detection point on the second sub-model.
  • The transparent display module 540 is configured to adjust the attribute information of each point to be collided based on the transparency information of each point to be collided, so that each point to be collided on the second sub-model is displayed based on the attribute information.
  • In an embodiment, the device further includes a target reconstruction function determination module; the target reconstruction function is configured to process the input current shooting angle to obtain the distance information between the first sub-model and the second sub-model to which the target detection point corresponding to the current shooting angle belongs.
  • The target reconstruction function determination module is configured to, before determining the target reconstruction function corresponding to each target detection point according to the pre-stored projection coefficient values of each target detection point on each basis function in the spherical harmonic function, determine the projection coefficient value of each target detection point on each basis function in the spherical harmonic function. Determining the projection coefficient values includes: for each target detection point on the first sub-model, determining the information of the collision points to be processed when the current target detection point emits physical rays in all directions and the physical rays pass through the second sub-model; determining, according to the current target detection point and the information of each collision point to be processed, the distance values of the current target detection point in each direction; and determining, according to the distance values of each target detection point in each direction, the distance function corresponding to each target detection point;
  • the distance function of each target detection point is processed based on the spherical harmonic function, and the projection coefficient value of each target detection point on each basis function in the spherical harmonic function is obtained;
  • the spherical harmonic function includes a plurality of basis functions.
  • In an embodiment, the distance information determination module is configured to, when determining the information of the collision points to be processed when the current target detection point emits physical rays in all directions and the physical rays pass through the second sub-model, and when determining the distance values of the current target detection point in each direction according to the current target detection point and the information of each collision point to be processed, also: take the current target detection point as the center of a sphere, emit physical rays in any direction in space, record the collision points to be processed when each physical ray intersects the second sub-model, and determine the distance value between each collision point to be processed and the target detection point; if there is no collision point to be processed between a physical ray and the second sub-model, mark the distance value corresponding to that physical ray as a set value; and determine, based on the distance values and set values corresponding to the physical rays, the distance values of the current target detection point in each direction.
  • In an embodiment, the device further includes a projection coefficient value determination module, configured to: determine the order of the spherical harmonic function, and determine the representation of the basis functions and the number of basis functions according to the order; and, for each target detection point, process the distance function of the current target detection point based on each basis function to obtain the projection coefficient value of the current target detection point in the direction of that basis function, where the number of projection coefficient values is the same as the number of basis functions.
  • In an embodiment, the device further includes a projection coefficient value storage module, configured to store the projection coefficient values of each target detection point in at least one image according to the spatial position relationship, and to store the at least one image at the target position, so as to obtain from the target position the projection coefficient values of all target detection points on the corresponding-order spherical harmonic function.
  • In an embodiment, the target reconstruction function determination module is further configured to: retrieve the at least one image from the target position, and determine the target pixel position of the current target detection point in the at least one image; retrieve the stored projection coefficient values from the target pixel positions in the at least one image respectively; and determine the target reconstruction function corresponding to the target detection point based on the projection coefficient values and the spherical harmonic function.
  • In an embodiment, the distance information determination module is further configured to: determine the relative shooting angle information corresponding to the current target detection point and the shooting device; and input the relative shooting angle information into the target reconstruction function to determine the distance information corresponding to the current collision point under the relative shooting angle, where the distance information is the distance between the target detection point and the collision point on the second sub-model when the line to which the target detection point and the shooting device belong collides with the second sub-model.
  • In an embodiment, the transparency information determination module is further configured to: when it is detected that the first sub-model is displayed semi-transparently, determine the transparency information of the point to be collided corresponding to each target detection point according to the distance information corresponding to each target detection point and according to the correspondence between the pre-established distance information and transparency information.
  • the transparency display module is further configured to adjust the color depth value of each to-be-collided point according to the transparency information of each to-be-collided point, so that each to-be-collided point on the second sub-model is transparently displayed based on its color depth value.
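As a final illustrative step, a plain alpha-blend sketch of adjusting the displayed colour of a point on the outer model: the colour values and the `strength` input come from the previous hypothetical helpers, not from the application, and the blend is only one possible way to realize the adjustment of the color depth value.

```python
import numpy as np

def blend_outer_color(outer_rgb, inner_rgb, strength):
    """Blend the outer-layer colour toward the inner-layer colour.

    `strength` in [0, 1] is the translucency strength derived from the
    reconstructed distance: 0 keeps the outer colour, 1 shows the inner colour.
    """
    outer = np.asarray(outer_rgb, dtype=np.float64)
    inner = np.asarray(inner_rgb, dtype=np.float64)
    return (1.0 - strength) * outer + strength * inner
```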
  • by processing the distance function based on the preset spherical harmonic function, the projection coefficients are determined and stored in the vertex color and/or attributes, and the transparency parameter under the target shooting angle is then determined and displayed according to the stored information and the reconstruction function, which solves the problem of poor transparent display caused by the deviation between the transparent display and the actual situation when the first sub-model is transparently displayed with a fixed transparency display value.
  • the apparatus for determining transparency provided by the embodiment of the present application can execute the method for determining transparency provided by any embodiment of the present application, and has corresponding functional modules and beneficial effects for executing the method.
  • FIG. 6 is a schematic structural diagram of an electronic device according to an embodiment of the present application, and FIG. 6 shows a block diagram of an exemplary electronic device 60 suitable for implementing the embodiments of the present application.
  • the electronic device 60 shown in FIG. 6 is only an example, and should not impose any limitations on the functions and scope of use of the embodiments of the present application.
  • electronic device 60 takes the form of a general-purpose computing device.
  • Components of electronic device 60 may include, but are not limited to, one or more processors or processing units 601, system memory 602, and a bus 603 connecting different system components (including system memory 602 and processing unit 601).
  • Bus 603 represents one or more of several types of bus structures, including a memory bus or memory controller, a peripheral bus, a graphics acceleration port, a processor, or a local bus using any of a variety of bus structures.
  • these architectures include, but are not limited to, the Industry Standard Architecture (ISA) bus, Micro Channel Architecture (MCA) bus, Enhanced ISA bus, Video Electronics Standards Association (VESA) local bus, and Peripheral Component Interconnect (PCI) bus.
  • Electronic device 60 typically includes a variety of computer system readable media. These media can be any available media that can be accessed by electronic device 60, including both volatile and non-volatile media, removable and non-removable media.
  • System memory 602 may include computer system readable media in the form of volatile memory, such as random access memory (RAM) 604 and/or cache memory 605.
  • Electronic device 60 may further include other removable/non-removable, volatile/non-volatile computer system storage media.
  • storage system 606 may be configured to read and write to non-removable, non-volatile magnetic media (not shown in Figure 6, commonly referred to as a "hard drive”).
  • a magnetic disk drive configured to read from and write to removable non-volatile magnetic disks (e.g., "floppy disks"), and an optical disk drive configured to read from and write to removable non-volatile optical disks (e.g., CD-ROM, DVD-ROM, or other optical media), may also be provided.
  • each drive may be connected to bus 603 through one or more data media interfaces.
  • System memory 602 may include at least one program product having a set (eg, at least one) of program modules configured to perform the functions of each embodiment of the present application.
  • Program modules 607 generally perform the functions and/or methods of the embodiments described herein.
  • the electronic device 60 may also communicate with one or more external devices 609 (e.g., keyboards, pointing devices, display 610, etc.), with one or more devices that enable a user to interact with the electronic device 60, and/or with any device (e.g., network card, modem, etc.) that enables the electronic device 60 to communicate with one or more other computing devices. Such communication may take place through input/output (I/O) interfaces 611. Also, the electronic device 60 may communicate with one or more networks (e.g., a Local Area Network (LAN), a Wide Area Network (WAN), and/or a public network such as the Internet) through a network adapter 612. As shown, the network adapter 612 communicates with the other modules of the electronic device 60 via the bus 603. It should be understood that, although not shown in FIG. 6, other hardware and/or software modules may be used in conjunction with the electronic device 60, including but not limited to: microcode, device drivers, redundant processing units, external disk drive arrays, Redundant Arrays of Independent Disks (RAID) systems, tape drives, and data backup storage systems, etc.
  • the processing unit 601 executes each functional application and data processing by running the program stored in the system memory 602, for example, implementing the method for determining transparency provided by the embodiments of the present application.
  • An embodiment of the present application further provides a storage medium containing computer-executable instructions, the computer-executable instructions, when executed by a computer processor, are configured to perform a method of determining transparency.
  • the method includes:
  • the relative shooting angle information between each target detection point on the first sub-model and the shooting device is determined; for each target detection point, the target reconstruction function corresponding to the current target detection point is determined, and the relative shooting angle information corresponding to the current target detection point is input into the target reconstruction function to obtain the distance information of the target detection point under the current relative shooting angle information;
  • the transparency information of the to-be-collided point corresponding to each target detection point is determined according to the distance information corresponding to each target detection point; the to-be-collided point is the collision point on the second sub-model corresponding to the target detection point;
  • the attribute information of each to-be-collided point is adjusted based on the transparency information of each to-be-collided point, so that each to-be-collided point on the second sub-model is transparently displayed based on the attribute information.
  • the computer storage medium of the embodiments of the present application may adopt any combination of one or more computer-readable media.
  • the computer-readable medium may be a computer-readable signal medium or a computer-readable storage medium.
  • the computer-readable storage medium can be, for example, but not limited to, an electrical, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus or device, or a combination of any of the above.
  • Examples (a non-exhaustive list) of computer-readable storage media include: an electrical connection with one or more wires, a portable computer disk, a hard disk, random access memory (RAM), read-only memory (ROM), erasable programmable read-only memory (EPROM or flash memory), optical fiber, portable compact disc read-only memory (CD-ROM), optical storage devices, magnetic storage devices, or any suitable combination of the above.
  • a computer-readable storage medium can be any tangible medium that contains or stores a program that can be used by or in conjunction with an instruction execution system, apparatus, or device.
  • a computer-readable signal medium may include a propagated data signal in baseband or as part of a carrier wave, with computer-readable program code embodied thereon. Such propagated data signals may take a variety of forms, including but not limited to electromagnetic signals, optical signals, or any suitable combination of the foregoing.
  • a computer-readable signal medium may also be any computer-readable medium other than a computer-readable storage medium, which can transmit, propagate, or transport a program arranged for use by or in connection with the instruction execution system, apparatus, or device.
  • Program code embodied on a computer readable medium may be transmitted using any suitable medium, including but not limited to wireless, wire, optical fiber cable, radio frequency (RF), etc., or any suitable combination of the foregoing.
  • Computer program code configured to perform the operations of the embodiments of the present application may be written in one or more programming languages, or combinations thereof, including object-oriented programming languages such as Java, Smalltalk, and C++, as well as conventional procedural programming languages such as the "C" language or similar programming languages.
  • the program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server.
  • the remote computer may be connected to the user's computer through any kind of network, including a local area network (LAN) or a wide area network (WAN), or may be connected to an external computer (e.g., through the Internet using an Internet service provider).

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Geometry (AREA)
  • Quality & Reliability (AREA)
  • Computer Graphics (AREA)
  • Image Generation (AREA)

Abstract

Embodiments of the present application disclose a method and apparatus for determining transparency, as well as an electronic device and a storage medium. The method comprises: determining relative shooting angle information, corresponding to each target detection point on a first sub-model, of a shooting device; determining a target reconstruction function corresponding to the current target detection point, and inputting the relative shooting angle information corresponding to the current target detection point into the target reconstruction function, so as to obtain distance information of the target detection point under the current shooting angle information; when it is detected that the first sub-model is in a semi-transparent display mode, determining, according to the distance information corresponding to each target detection point, transparency information of a to-be-collided point corresponding to each target detection point; and adjusting attribute information of each point on the basis of the transparency information of each point, such that each point on a second sub-model is transparently displayed on the basis of the attribute information.
PCT/CN2021/131500 2020-12-08 2021-11-18 Procédé et appareil de détermination de transparence, et dispositif électronique et support d'enregistrement WO2022121654A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202011445988.4A CN114627040A (zh) 2020-12-08 2020-12-08 确定透明度的方法、装置、电子设备及存储介质
CN202011445988.4 2020-12-08

Publications (1)

Publication Number Publication Date
WO2022121654A1 true WO2022121654A1 (fr) 2022-06-16

Family

ID=81895496

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/131500 WO2022121654A1 (fr) 2020-12-08 2021-11-18 Procédé et appareil de détermination de transparence, et dispositif électronique et support d'enregistrement

Country Status (2)

Country Link
CN (1) CN114627040A (fr)
WO (1) WO2022121654A1 (fr)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6361438B1 (en) * 1997-07-25 2002-03-26 Konami Co., Ltd. Video game transparency control system for images
US20070018996A1 (en) * 2005-07-25 2007-01-25 Microsoft Corporation Real-time rendering of partially translucent objects
CN106023300A (zh) * 2016-05-09 2016-10-12 深圳市瑞恩宁电子技术有限公司 一种半透明材质的体渲染方法和系统
CN111243075A (zh) * 2020-03-17 2020-06-05 广东趣炫网络股份有限公司 一种面向手游的生成水深度图的方法、装置和设备

Also Published As

Publication number Publication date
CN114627040A (zh) 2022-06-14

Similar Documents

Publication Publication Date Title
CN111815755A (zh) 虚拟物体被遮挡的区域确定方法、装置及终端设备
US11263803B2 (en) Virtual reality scene rendering method, apparatus and device
US20170186219A1 (en) Method for 360-degree panoramic display, display module and mobile terminal
WO2021228031A1 (fr) Procédé, appareil et système de rendu
CN111882634B (zh) 一种图像渲染方法、装置、设备及存储介质
WO2020248900A1 (fr) Procédé et appareil de traitement de vidéo panoramique et support de stockage
US11727632B2 (en) Shader binding management in ray tracing
CN112840378A (zh) 在路径追踪中使用共享光照贡献进行相互作用的全局照明
WO2022089592A1 (fr) Procédé de rendu d'éléments graphiques et dispositif associé
WO2022121653A1 (fr) Procédé et appareil de détermination de transparence, dispositif électronique et support d'enregistrement
US11954830B2 (en) High dynamic range support for legacy applications
CN110930497A (zh) 一种全局光照相交加速方法、装置及计算机存储介质
US11475549B1 (en) High dynamic range image generation from tone mapped standard dynamic range images
EP3956752B1 (fr) Expérience de réalité artificielle augmentée sémantique
WO2022121654A1 (fr) Procédé et appareil de détermination de transparence, et dispositif électronique et support d'enregistrement
US20230267063A1 (en) Real-time latency measurements in streaming systems and applications
CN112528707A (zh) 图像处理方法、装置、设备及存储介质
WO2022121652A1 (fr) Procédé et appareil de détermination de transparence, dispositif électronique et support d'enregistrement
US20230281906A1 (en) Motion vector optimization for multiple refractive and reflective interfaces
Fu et al. Dynamic shadow rendering with shadow volume optimization
CN115715464A (zh) 用于遮挡处理技术的方法和装置
CN114612603A (zh) 确定透明度的方法、装置、电子设备及存储介质
CN112465692A (zh) 图像处理方法、装置、设备及存储介质
CN114627231A (zh) 确定透明度的方法、装置、电子设备及存储介质
US11823318B2 (en) Techniques for interleaving textures

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21902361

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the addressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 14.11.2023)