WO2022121654A1 - Transparency determination method and apparatus, and electronic device and storage medium - Google Patents

Transparency determination method and apparatus, and electronic device and storage medium Download PDF

Info

Publication number
WO2022121654A1
WO2022121654A1 · PCT/CN2021/131500 · CN2021131500W
Authority
WO
WIPO (PCT)
Prior art keywords
target detection
detection point
point
sub-model
Prior art date
Application number
PCT/CN2021/131500
Other languages
French (fr)
Chinese (zh)
Inventor
冯乐乐
Original Assignee
上海米哈游天命科技有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 上海米哈游天命科技有限公司
Publication of WO2022121654A1 publication Critical patent/WO2022121654A1/en

Links

Images

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/0002Inspection of images, e.g. flaw detection
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/003D [Three Dimensional] image rendering
    • G06T15/50Lighting effects
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/60Analysis of geometric attributes
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/90Determination of colour characteristics
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10004Still image; Photographic image

Definitions

  • the embodiments of the present application relate to the technical field of games, for example, to a method, apparatus, electronic device, and storage medium for determining transparency.
  • In games, a translucent effect is usually set between an inner model and an outer model, for example, the translucent display of a skin model through a clothes model.
  • The translucent display mainly depends on the effect produced when light reflected by the inner-layer object penetrates the outer-layer object over a certain distance and is then incident on the human eye.
  • Each model is composed of individual points. The farther the distance between a point on the inner-layer object and the corresponding point on the outer-layer object, the weaker the translucent effect; conversely, the closer the distance, the stronger the translucent effect.
  • determining the translucent effect is mainly to set the transparency display value of the outer layer model.
  • the transparency display value is usually fixed, and all points on the outer layer model realize the transparent display according to the set transparency display value.
  • The transparent display effect produced by this method deviates from the actual situation, so the transparent display effect is poor and the user experience suffers.
  • Embodiments of the present application provide a method, apparatus, electronic device, and storage medium for determining transparency, so that the transparent display effect is consistent with reality and user experience is improved.
  • an embodiment of the present application provides a method for determining transparency, the method comprising:
  • determining the relative shooting angle information corresponding to the shooting device and each target detection point on the first sub-model; for each target detection point, determining the target reconstruction function corresponding to the current target detection point, and inputting the relative shooting angle information corresponding to the current target detection point into the target reconstruction function to obtain the distance information between the first sub-model and the second sub-model of the target detection point under the current shooting angle information;
  • when it is detected that the first sub-model is displayed translucently, determining the transparency information of the to-be-collided point corresponding to each target detection point according to the distance information corresponding to each target detection point; the to-be-collided point is the collision point on the second sub-model corresponding to the target detection point;
  • based on the transparency information of each to-be-collided point, adjusting the attribute information of each to-be-collided point, so that each to-be-collided point on the second sub-model is displayed based on the attribute information.
  • the embodiments of the present application also provide a device for determining transparency, including:
  • a relative shooting angle information determination module, configured to determine the relative shooting angle information corresponding to the shooting device and each target detection point on the first sub-model;
  • the distance information determination module is configured to determine the target reconstruction function corresponding to the current target detection point for each target detection point, and input the relative shooting angle information corresponding to the current target detection point into the target reconstruction function , to obtain the distance information between the first sub-model and the second sub-model of the target detection point under the current shooting angle information;
  • the first sub-model is a sub-model that wraps a part of the second sub-model;
  • the reconstruction function is constructed based on the projection coefficient value of the target detection point on each basis function in the spherical harmonic function;
  • the transparency information determination module is configured to determine the transparency information of the to-be-collision point corresponding to each target detection point according to the distance information corresponding to each target detection point when it is detected that the first sub-model is translucent;
  • the to-be-collided point is the collision point corresponding to the target detection point on the second sub-model;
  • the transparent display module is configured to adjust the attribute information of each to-be-collision point based on the transparency information of each to-be-collision point, so that each to-be-collision point on the second sub-model is displayed based on the attribute information.
  • an embodiment of the present application also provides an electronic device, the electronic device comprising:
  • one or more processors;
  • storage means arranged to store one or more programs
  • when the one or more programs are executed by the one or more processors, the one or more processors implement the method for determining transparency described in any one of the embodiments of the present application.
  • the embodiments of the present application further provide a storage medium containing computer-executable instructions, where the computer-executable instructions, when executed by a computer processor, are configured to perform the method for determining transparency described in any one of the embodiments of the present application.
  • FIG. 1 is a schematic flowchart of a method for determining transparency according to an embodiment of the present application.
  • FIG. 2 is a schematic flowchart of a method for determining transparency provided by another embodiment of the present application.
  • FIG. 3 is a schematic flowchart of a method for determining transparency provided by another embodiment of the present application.
  • FIG. 4 is a schematic flowchart of a method for determining transparency provided by another embodiment of the present application.
  • FIG. 5 is a schematic structural diagram of a device for determining transparency according to an embodiment of the present application.
  • FIG. 6 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
  • FIG. 1 is a schematic flowchart of a method for determining transparency provided by an embodiment of the present application.
  • This embodiment can be applied to the case of determining, according to pre-stored projection coefficient values, the target reconstruction function corresponding to each target detection point, determining the distance information between the first sub-model and the second sub-model corresponding to the target detection point at the corresponding viewing angle according to the target reconstruction function of each target detection point, and then performing transparent display based on the distance information.
  • the method may be performed by an apparatus for determining transparency, the apparatus may be implemented in the form of software and/or hardware, and the hardware may be an electronic device, for example, the electronic device may be a mobile terminal or the like.
  • the first sub-model and the second sub-model are relative.
  • the sub-models can be a skin model and a clothes model; the model corresponding to the clothes can be used as the first sub-model, and the model corresponding to the skin can be used as the second sub-model.
  • the target detection point may be a preset detection point on the first sub-model; it may also be that the preset point on the first sub-model is divided into a plurality of blocks, and the center point of each block may be used as the target detection point; or It can be a detection point set by the developer according to actual needs; of course, it can also be that the first sub-model is composed of multiple points, and each point can be used as a target detection point.
  • the photographing device is a device for observing and photographing the first sub-model
  • the relative photographing angle is the relative angle between the photographing device and each target detection point on the first sub-model.
  • the relative shooting angle information is determined according to the position information between the shooting device and each target detection point.
  • According to the position information between the shooting device and each target detection point, the relative angle information between the shooting device and each target detection point can be determined, and this angle information can be used as the relative shooting angle information.
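  • As a minimal illustrative sketch only (the function and variable names below are hypothetical and not from this application), the relative shooting angle information could be derived from the positions of the shooting device and a target detection point by normalizing the direction vector between them and converting it to spherical angles:

```python
import numpy as np

def relative_shooting_angle(camera_pos, detection_point_pos):
    """Hypothetical sketch: derive relative shooting angle information
    (polar/azimuth angles) from the camera position and a target
    detection point position."""
    direction = np.asarray(camera_pos, dtype=float) - np.asarray(detection_point_pos, dtype=float)
    direction /= np.linalg.norm(direction)                 # unit view direction
    theta = np.arccos(np.clip(direction[2], -1.0, 1.0))    # polar angle from +z
    phi = np.arctan2(direction[1], direction[0])           # azimuth angle
    return theta, phi

# Example usage with made-up coordinates
theta, phi = relative_shooting_angle(camera_pos=(0.0, 2.0, 5.0),
                                     detection_point_pos=(0.0, 1.5, 0.0))
```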
  • the first sub-model is a sub-model that wraps the part of the second sub-model; the reconstruction function is constructed based on the projection coefficient value of the target detection point on each basis function in the spherical harmonic function.
  • Each target detection point has a corresponding target reconstruction function.
  • The reconstruction function of one of the target detection points is taken as an example for introduction.
  • The reconstruction function of a target detection point is a function constructed by processing the projection coefficients of the current target detection point according to the spherical harmonic function, based on the distance values obtained when the current target detection point emits physical rays in each direction in space and the physical rays collide with the second sub-model.
  • the reconstruction function can be used to process the input relative shooting angle to determine the distance information between the first sub-model and the second sub-model corresponding to the target detection point under the relative shooting angle.
  • According to the projection coefficient values of the current target detection point on each basis function in the preset spherical harmonic function, the distance information between the current target detection point and the collision point can be obtained for each direction in which the current target detection point emits a physical ray and the ray collides with the second sub-model. According to the distance information of each direction in space corresponding to the current target detection point, a reconstruction function corresponding to the current target detection point is constructed.
  • In use, the target shooting angle can be input into the reconstruction function corresponding to the current target detection point. The reconstruction function processes the target shooting angle and outputs the distance information between the current target detection point and the collision point at which the line to which the current target detection point and the target photographing device belong collides with the second sub-model under that angle.
  • For example, the relative shooting angle can be input into the reconstruction function corresponding to target detection point A, and the distance information between the first sub-model and the second sub-model corresponding to target detection point A at this angle can be obtained; for example, the obtained distance information is 5 mm.
  • the translucent display can be understood as the material used in the first sub-model is a translucent material.
  • the distance information corresponding to each target detection point may be the same or different.
  • the distance information corresponding to each target detection point refers to the distance between the target detection point and the point to be collided when the line to which the photographing device and the target detection point belong collide with the second sub-model.
  • Transparency information is used to indicate the degree of transparency when the model is displayed, which can be expressed as a percentage, for example: the transparency is 80% and so on.
  • the same method is used to determine the transparency parameters of different target detection points.
  • Determining the transparency parameter of one of the target detection points is taken as an example for introduction.
  • Determining the transparency parameter of the current target detection point may be: determining the transparency parameter from the pre-stored correspondence between distance information and transparency parameters according to the distance information; or calculating it according to a preset transparency parameter calculation model, where the distance information is input into the transparency parameter calculation model and the transparency parameter corresponding to the distance information is obtained through calculation.
  • Here f, the mapping from distance information to the transparency parameter, is a monotonically decreasing function.
  • the target detection points on the first sub-model can be displayed according to the corresponding transparency parameters, and the intersection of the line to which the target detection point and the target photographing device belong with the second sub-model is displayed according to the corresponding transparency parameter, so as to obtain the effect of transparent display.
  • the transparency parameter of each target detection point is determined, and the effect of transparently displaying the first sub-model and the second sub-model can be realized.
  • the transparency parameter corresponding to the distance information may be determined from the pre-stored correspondence between each piece of distance information and the transparency parameter; this may include the transparency parameter of the first sub-model and the transparency parameter of the second sub-model for subsequent transparent display.
  • Based on the above transparency parameters, the attribute information of the to-be-collided point corresponding to each target detection point on the first sub-model can be adjusted.
  • the transparency information of the to-be-collided point corresponding to each target detection point is determined according to the distance information corresponding to each target detection point, including: when it is detected that the first sub-model is displayed semi-transparently, determining the transparency information of the to-be-collided point corresponding to each target detection point according to the distance information corresponding to each target detection point and the pre-established correspondence between distance information and transparency information.
  • the corresponding relationship between the distance information and the transparency parameter may be established in advance, and the transparency parameter of each target detection point may be determined according to the corresponding relationship. For example, when the distance information is 5mm, the transparency parameter is determined to be 0.5 according to the corresponding relationship; when the distance information is 10mm, the transparency parameter is determined to be 0.2 according to the corresponding relationship.
  • the transparency information corresponding to each target detection point can be determined according to the corresponding relationship between the distance information and the transparency parameter.
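  • The following is a minimal sketch of such a distance-to-transparency mapping, assuming a lookup table built from the example correspondence above (5 mm → 0.5, 10 mm → 0.2; the other entries are invented for illustration) with linear interpolation as the monotonically decreasing function; all names are hypothetical:

```python
import numpy as np

# Hypothetical correspondence between distance information and the transparency
# parameter; the 5 mm -> 0.5 and 10 mm -> 0.2 pairs come from the example above,
# the remaining entries are assumed for illustration.
DISTANCE_MM = np.array([0.0, 5.0, 10.0, 20.0])
TRANSPARENCY = np.array([1.0, 0.5, 0.2, 0.0])   # monotonically decreasing in distance

def transparency_from_distance(distance_mm: float) -> float:
    """Look up (and linearly interpolate) the transparency parameter for a given
    distance; farther points yield a weaker translucent effect."""
    return float(np.interp(distance_mm, DISTANCE_MM, TRANSPARENCY))

print(transparency_from_distance(7.5))   # somewhere between 0.5 and 0.2
```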
  • the point to be collided is a point on the second sub-model.
  • the point to be collided can be obtained by colliding with the second sub-model. That is, the point to be collided is a point corresponding to each target detection point on the second sub-model under the relative shooting angle of each target detection point.
  • the attribute information can be the color attribute of the point to be collided.
  • the color attribute of the to-be-collided point corresponding to the target detection point can be adjusted according to the transparency information. For example, the greater the distance information, the darker the color of the point to be collided, and vice versa. In this way, when the outer layer is displayed semi-transparently, the color depth of the to-be-collided point corresponding to each target detection point can be adjusted, thereby realizing transparent display.
  • In the technical solution of this embodiment, the distance information between the first sub-model and the second sub-model corresponding to the target detection point is obtained by determining the target shooting angle and retrieving the projection coefficient values from the vertex color and/or attribute information, and the transparency parameter is determined and displayed accordingly. This solves the technical problem of poor transparent display effect and poor user experience caused by the deviation between the transparent display and the actual situation when the first sub-model is transparently displayed with a fixed transparency display value; the transparency parameter is adjusted according to the actual situation, so that the transparent display effect is consistent with the theoretical effect and the user experience is improved.
  • the attribute information of each point to be collided is adjusted based on the transparency information of each point to be collided, so that each point to be collided on the second sub-model is displayed based on the attribute information
  • the method includes: adjusting the color depth value of each point to be collided according to the transparency information of each point to be collided, so as to perform transparent display based on the color depth value of each point to be collided on the second sub-model.
  • the attribute information may be a color depth value.
  • the color depth of the to-be-collided point corresponding to each target detection point can be adjusted according to the transparency parameter corresponding to each target detection point. For example, the closer the distance, the larger the color depth value and the lighter the color of the point to be collided.
  • the color attributes corresponding to different transparency parameters can be set according to the actual situation. After the transparency parameter is determined, the color depth value of each point to be collided can be adjusted according to the color depth value corresponding to the transparency parameter.
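  • A possible sketch of the attribute adjustment described above, assuming the attribute information is an RGB color and that a larger transparency parameter (closer distance) lightens the color of the to-be-collided point; the blending rule and names are illustrative assumptions, not the application's exact formula:

```python
import numpy as np

def adjust_color_depth(base_color, transparency):
    """Hypothetical sketch: adjust the color depth value of a to-be-collided
    point. A larger transparency parameter (closer distance) gives a lighter
    color; a smaller one (farther distance) keeps the color darker."""
    base_color = np.asarray(base_color, dtype=float)   # RGB in [0, 1]
    white = np.ones(3)
    # Blend toward white in proportion to the transparency parameter.
    return (1.0 - transparency) * base_color + transparency * white

# Example usage with made-up values
skin_color = adjust_color_depth(base_color=(0.80, 0.60, 0.50), transparency=0.5)
```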
  • In addition, the attribute information of the first sub-model and the second sub-model can be obtained, and the transparency information corresponding to each target detection point can be determined comprehensively by combining the attribute information and the distance information, so that the first sub-model and the second sub-model are transparently displayed.
  • FIG. 2 is a schematic flowchart of a method for determining transparency provided by another embodiment of the present application.
  • a reconstruction function can be reconstructed according to the spherical harmonic function and the projection coefficient values, so as to determine the distance information of each target detection point based on the reconstruction function.
  • the specific implementation can refer to the following specific description. Wherein, the explanations of terms that are the same as or corresponding to the above embodiments are not repeated here.
  • the method for determining transparency includes:
  • S210: Determine a target reconstruction function corresponding to each target detection point according to the pre-stored projection coefficient values of each target detection point on each basis function in the spherical harmonic function.
  • the target reconstruction function is used to process the input current shooting angle to obtain distance information between the first sub-model and the second sub-model to which the target detection point corresponding to the current shooting angle belongs.
  • Each target detection point has its corresponding target reconstruction function.
  • When rays are emitted from each target detection point, physical rays can be emitted in each direction in space, and the distance information between the first sub-model and the second sub-model along each physical ray can be determined.
  • the distance information is processed, and all distances can be compressed into several function values, which can be used as projection coefficient values.
  • the spherical function corresponding to the target detection point can be reconstructed, so that the relative shooting angle can be input into this spherical function, that is, the reconstruction function, and the distance information at that angle can be obtained.
  • the pre-stored projection coefficient value corresponding to each target detection point can be obtained, and the projection coefficient value can be processed based on the spherical harmonic function, and the target reconstruction function corresponding to each target detection point can be reconstructed, and the target reconstruction function can be A function of determining the distance information between the first sub-model and the second sub-model at each relative shooting angle.
  • S220 Determine relative shooting angle information corresponding to each target detection point on the shooting device and the first sub-model.
  • For each target detection point, determine the target reconstruction function corresponding to the current target detection point, and input the relative shooting angle information corresponding to the current target detection point into the target reconstruction function to obtain the distance information between the first sub-model and the second sub-model of the target detection point under the current shooting angle information.
  • In the technical solution of this embodiment, the projection coefficient values corresponding to each target detection point can be reconstructed according to the spherical harmonic function to obtain the reconstruction function of each target detection point; the relative shooting angle corresponding to each target detection point is input into the reconstruction function, so that the distance information between the first sub-model and the second sub-model corresponding to each target detection point can be obtained. The transparency parameter information can further be determined in combination with the distance information, and the color attribute of each point to be collided on the second sub-model is then dynamically adjusted according to the distance information and the color attribute corresponding to each target detection point, so that the model is transparently displayed.
  • FIG. 3 is a schematic flowchart of a method for determining transparency according to another embodiment of the present application.
  • the method further includes: determining the projection coefficient value of each target detection point.
  • the method includes:
  • determining the projection coefficient value of each target detection point may be: for each target detection point on the first sub-model, determining the to-be-processed collision point information when the current target detection point emits physical rays in all directions and the rays pass through the second sub-model;
  • the distance values of the current target detection point in each direction are determined according to the current target detection point and the information of each collision point to be processed;
  • according to the distance values of each target detection point in each direction, a distance function corresponding to each target detection point is determined.
  • the distance function of each target detection point is processed based on the spherical harmonic function, and the projection coefficient value of each target detection point on each basis function in the spherical harmonic function is obtained; the spherical harmonic function includes a plurality of basis functions.
  • the first sub-model and the second sub-model are relative. If the application scene is a skin model and a clothing model, the model corresponding to the clothes can be used as the first sub-model, and the model corresponding to the skin can be used as the second sub-model.
  • the target detection point may be a preset detection point on the first sub-model; it may also be that the preset point on the first sub-model is divided into a plurality of blocks, and the center point of each block may be used as the target detection point; or It can be a detection point set by the developer according to actual needs; of course, it can also be that the first sub-model is composed of multiple points, and each point can be used as a target detection point.
  • the distance function corresponds to each target detection point, that is, a target detection point corresponds to a distance function.
  • Each distance function can be understood as being determined according to the relative distance information between a certain target detection point and the second sub-model in each direction in space.
  • the distance between each target detection point and the second sub-model in each direction can be: take the target detection point as the center of the sphere, emit physical rays in each direction in space, and determine each Distance information between the intersection of the physical ray and the second sub-model and the target detection point.
  • That is, each distance function can be determined based on the distances between a certain target detection point and the second sub-model in each direction.
  • the same method is used to determine the spherical distribution distance function of different target detection points.
  • Determining the spherical distribution distance function of one of the target detection points is taken as an example for introduction.
  • A target detection point on the first sub-model can be used as the center of the sphere, physical rays can be emitted in every direction in space, and the distance information between the intersection of each physical ray and the second sub-model and the target detection point can be determined. If a physical ray has an intersection with the second sub-model in the direction of the ray, the distance between the intersection and the target detection point is used as the distance information between the target detection point and the second sub-model in that direction. If a physical ray does not have an intersection with the second sub-model in the direction of the ray, the preset maximum distance information may be used as the distance information between the target detection point and the second sub-model in that direction.
  • the distance function of the spherical distribution of the target detection point can be determined.
  • For example, the spherical distribution distance function of target detection point A is F(i) = dist_i, i = 1, 2, …, n, where i represents the i-th direction, F(i) represents the distance information in the i-th direction, dist_i represents the specific distance value in the i-th direction, and n represents the total number of directions.
  • the distance function of the spherical distribution of each target detection point is a composite function
  • the number of sub-functions in the composite function can be determined according to the preset number of samples.
  • the default can be 16 × 32 sampling precision, that is, the composite function contains 512 sub-functions.
  • the specific sampling quantity can be determined according to actual needs.
  • the same method is used to determine the projection coefficient values of different target detection points.
  • the projection coefficient value of one of the target detection points is determined as an example.
  • the spherical harmonic function is composed of multiple basis functions, and each distance function can be processed by using the basis function. Different orders of spherical harmonics correspond to different numbers of basis functions.
  • the second-order spherical harmonics include 4 basis functions, the third-order spherical harmonics include 9 basis functions, and the fourth-order spherical harmonics include 16 basis functions.
  • the projection coefficient value is the coefficient value obtained by processing the distance function according to the basis function in the spherical harmonic function, and the number of coefficient values is the same as that of the basis function. According to the spherical harmonic function and its corresponding multiple basis functions, the spherical distribution distance function of the target detection point can be compressed to obtain the projection coefficient value.
  • the spherical harmonic function is second-order, that is, it includes 4 basis functions, then inputting the distance function corresponding to the target detection point into the spherical harmonic function can obtain 4 projection coefficient values, so as to compress the distance function into 4 projection coefficient values.
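  • The following sketch illustrates the compression step for the second-order case (4 basis functions), using the standard real spherical harmonic basis of bands 0 and 1 and assuming the sampling directions are distributed uniformly over the sphere (otherwise each sample would need a solid-angle weight); function names are illustrative, not from this application:

```python
import numpy as np

def real_sh_basis_4(direction):
    """First four real spherical harmonic basis functions (bands 0 and 1),
    evaluated at a unit direction (x, y, z)."""
    x, y, z = direction
    return np.array([
        0.282095,          # Y_0^0
        0.488603 * y,      # Y_1^-1
        0.488603 * z,      # Y_1^0
        0.488603 * x,      # Y_1^1
    ])

def project_distance_function(directions, distances):
    """Hypothetical sketch: compress the sampled spherical distance function
    F(i) = dist_i into 4 projection coefficient values by integrating it
    against each basis function (uniform sphere sampling assumed)."""
    coeffs = np.zeros(4)
    for d, dist in zip(directions, distances):
        coeffs += dist * real_sh_basis_4(d)
    return coeffs * (4.0 * np.pi / len(directions))   # Monte Carlo estimate of the integral
```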
  • The higher the order of the spherical harmonic function, the higher the similarity between the reconstructed sphere and the actual sphere during reconstruction, so developers can choose spherical harmonic functions of different orders according to actual needs.
  • the higher-order spherical harmonic function contains more basis functions, and the degree of restoration of the distance function is higher in the subsequent restoration of the distance function according to the spherical harmonic function and the projection coefficient value.
  • The spherical distribution distance function of the target detection point can be input into the spherical harmonic function, and the distance function is processed based on the basis functions in the spherical harmonic function to obtain the projection coefficient values of the distance function on each basis function. It follows that the number of projection coefficient values is equal to the number of basis functions in the spherical harmonic function.
  • the distance functions of the spherical distribution of different target detection points can be the same or different.
  • Whether the distance functions corresponding to different target detection points are the same is determined by the distance information of the target detection points in each direction in space; after different distance functions are input into the preset spherical harmonic function for processing, the obtained projection coefficient values are different.
  • the method further includes: storing the projection coefficient values in at least one image, so as to obtain the projection coefficient values of the corresponding target detection points from the at least one image and thereby reconstruct the corresponding target reconstruction functions.
  • The projection coefficient values of each target detection point on the corresponding-order spherical harmonic basis functions are correspondingly stored in at least one image according to the spatial position relationship, and the at least one image is stored in the target position, so that the projection coefficient values of all target detection points can be obtained from the target position.
  • the method of storing the projection coefficient values of different target detection points is the same, and it can be introduced by taking the storage of the projection coefficient value corresponding to a certain target detection point as an example.
  • the number of basis functions can be four, and after the distance function is processed by the basis functions in the spherical harmonic function, four projection coefficient values can be obtained.
  • the projection coefficient values can be stored in the image according to the spatial coordinate position information. For example, if there are four projection coefficient values, four images can be used to store the projection coefficient values, and each pixel in each image stores a projection coefficient value. At least one image may be stored in the engine, and the location stored in the engine may be used as the target location.
  • the advantage of storing the image to the target location is that the stored projection coefficient values can be obtained from the target location so that a reconstruction function can be constructed from the projection coefficient values.
  • the number of images may be determined according to the number of projection coefficient values, and the projection coefficient values may be stored in corresponding pixel points in the image according to the spatial coordinate information.
  • each target detection point has 9 projection coefficient values.
  • nine images can be determined, and the nine projection coefficient values of each target detection point can be stored in the corresponding pixel points in each image according to the spatial coordinate information.
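  • A minimal sketch of this storage scheme, assuming nine single-channel images and a precomputed pixel position per target detection point (image size and names are invented for illustration):

```python
import numpy as np

# Hypothetical sketch: nine projection coefficient values per target detection
# point, stored in nine single-channel images; the pixel position of each point
# is assumed to be precomputed from its spatial (or UV) coordinates.
WIDTH, HEIGHT, NUM_COEFFS = 256, 256, 9
coefficient_images = np.zeros((NUM_COEFFS, HEIGHT, WIDTH), dtype=np.float32)

def store_coefficients(pixel_xy, coeffs):
    """Write one point's projection coefficient values into the images."""
    x, y = pixel_xy
    for k in range(NUM_COEFFS):
        coefficient_images[k, y, x] = coeffs[k]

def load_coefficients(pixel_xy):
    """Read one point's projection coefficient values back from the images."""
    x, y = pixel_xy
    return coefficient_images[:, y, x].copy()
```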
  • the projection coefficient value can also be stored in the vertex color.
  • Each target detection point on the first sub-model can carry a vertex color; each vertex color corresponds to four pixel channels, that is, the R, G, B and A channels, and a projection coefficient value of the target detection point can be stored in each channel.
  • the attribute information can be scalable information corresponding to each target detection point, for example: UV, ie u, v texture map coordinates.
  • the vertex color corresponding to the target detection point and the attribute information of the current target detection point can be used together to store the projection coefficient value.
  • the projection coefficient value corresponding to the current target detection point is stored in the vertex color corresponding to the current target detection point
  • four projection coefficient values can be stored according to the four channels R, G, B and A respectively, and the number of vertex colors required can be determined according to the number of projection coefficient values. It is also possible to store 4 projection coefficient values in one vertex color and store the remaining projection coefficient values in the UV coordinates corresponding to the current target detection point. The above method can reduce the use of vertex colors and facilitate the storage of projection coefficient values and their subsequent retrieval and use.
  • the vertex color corresponding to the target detection point and the UV coordinate corresponding to the current target detection point can also be used together to store the projection coefficient value. For example, if 9 projection coefficient values are stored, two vertex colors can be used to store 8 projection coefficient value, and store the remaining 1 projection coefficient value in the UV coordinate corresponding to the current target detection point.
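  • The packing described above could look like the following sketch, where eight projection coefficient values occupy two RGBA vertex colors and the ninth is placed in a spare UV channel; any range remapping needed to fit vertex-color limits is omitted, and all names are illustrative assumptions:

```python
def pack_nine_coefficients(coeffs):
    """Hypothetical sketch of the packing described above: eight projection
    coefficient values go into two RGBA vertex colors and the ninth into one
    UV coordinate channel of the target detection point (vertex)."""
    assert len(coeffs) == 9
    vertex_color_0 = tuple(coeffs[0:4])   # R, G, B, A
    vertex_color_1 = tuple(coeffs[4:8])   # R, G, B, A
    uv_extra = coeffs[8]                  # e.g. the u channel of a spare UV set
    return vertex_color_0, vertex_color_1, uv_extra

def unpack_nine_coefficients(vertex_color_0, vertex_color_1, uv_extra):
    """Inverse of the packing, used when reconstructing the function."""
    return list(vertex_color_0) + list(vertex_color_1) + [uv_extra]
```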
  • After the projection coefficient values are stored, they can be imported into the target position in the engine according to the index coordinates of the current target detection point, so that the projection coefficient values corresponding to the current target detection point can be retrieved from the engine according to the target detection point, the transparency parameter of each target detection point can be determined from the target distance information reconstructed from the coefficient values, and the first sub-model and the second sub-model can then be displayed based on each transparency parameter.
  • For each target detection point, retrieve the at least one image from the target position, determine the target pixel position of the current target detection point in the at least one image, and obtain the target reconstruction function corresponding to the current target detection point according to the retrieved projection coefficient values.
  • each target detection point on the first sub-model needs to be displayed transparently, the distance information of each target detection point at the relative shooting angle needs to be determined, and then the transparency parameter is determined according to the distance information.
  • the reconstruction function corresponding to each target detection point can be reconstructed according to the projection coefficient value stored at the target position, so that the relative shooting angle can be input into the corresponding reconstruction function , to obtain the distance information corresponding to each target detection point.
  • the spatial coordinate information of the current target detection point can be obtained, and the target pixel position of the current target detection point in each image can be determined according to the spatial coordinate information.
  • The stored projection coefficient values are obtained from the target pixel position. Each stored projection coefficient value is reconstructed according to the spherical harmonic function to obtain, for the physical rays emitted by the current target detection point in each direction in space, the distance information between the current target detection point and the to-be-collided point when each physical ray collides with the second sub-model; the distance function obtained from these distance values is the target reconstruction function corresponding to the current target detection point.
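  • A minimal sketch of evaluating such a target reconstruction function: the distance along a given relative shooting direction is approximated as the sum of the stored projection coefficient values multiplied by the corresponding basis functions (second-order case shown; names are illustrative, not from this application):

```python
import numpy as np

def real_sh_basis_4(direction):
    """First four real spherical harmonic basis functions (bands 0 and 1)."""
    x, y, z = direction
    return np.array([0.282095, 0.488603 * y, 0.488603 * z, 0.488603 * x])

def reconstruct_distance(coeffs, view_direction):
    """Hypothetical sketch of the target reconstruction function: the distance
    between the first and second sub-model along the relative shooting
    direction is approximated as the dot product of the projection coefficient
    values with the basis functions evaluated in that direction."""
    view_direction = np.asarray(view_direction, dtype=float)
    view_direction /= np.linalg.norm(view_direction)
    return float(np.dot(coeffs, real_sh_basis_4(view_direction)))
```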
  • In the technical solution of this embodiment, the projection coefficient values corresponding to each target detection point are determined in advance and stored in a preset number of maps, so that when transparent display is performed, the stored projection coefficient values can be retrieved from the corresponding positions and a reconstruction function can then be reconstructed based on them, so as to determine the distance information of each target detection point under different relative shooting angles based on the reconstruction function.
  • FIG. 4 is a schematic flowchart of a method for determining transparency according to another embodiment of the present application.
  • On the basis of the foregoing embodiments, for the specific method of determining the distance function, reference may be made to the technical solution of this embodiment. The explanations of terms that are the same as or corresponding to the above embodiments are not repeated here.
  • the method includes:
  • the information of each collision point to be processed may include position information of each collision point to be processed, such as spatial coordinate information and the like. For example, take the current target detection point as the center of the sphere, emit physical rays in any direction in space, record the collision points to be processed when each physical ray intersects the second sub-model, and determine the distance value between each collision point to be processed and the target detection point.
  • physical rays may be emitted in each direction based on each target detection point on the first sub-model, and the above-mentioned physical rays may pass through the second sub-model.
  • the collision point information to be processed may be information used to describe the position of the point, such as spatial coordinate information. Therefore, according to the collision point to be processed, it can be determined that the spatial coordinate information corresponding to the collision point to be processed is the information of the collision point to be processed.
  • determining the information of the collision point to be processed may be: taking the current target detection point as the center of the sphere, emitting physical rays in any direction in space, and determining the information of the collision point to be processed when each physical ray passes through the second sub-model .
  • the emission of physical rays in each direction from each target detection point on the first sub-model can be regarded as taking the current target detection point as the center of the sphere, and emitting physical rays in each direction of the spherical surface. If the physical ray passes through the second sub-model, the spatial coordinate information of the intersection of the physical ray and the second sub-model is used as the collision point information to be processed.
  • S420 Determine the distance information between the current target detection point and the second sub-model in all directions according to the current target detection point and the information of each collision point to be processed.
  • the distance information between the information of the collision point to be processed and the current target detection point is determined.
  • The calculation formula for the distance between two points in space can be used to calculate the distance between the spatial coordinate information of the collision point to be processed and the spatial coordinate information of the current target detection point, and this distance is used as the distance information of the current target detection point in that direction.
  • If a physical ray does not collide with the second sub-model, that is, there is no collision point between the physical ray and the second sub-model, the distance information corresponding to that physical ray is set to a set value.
  • the set value may be the maximum distance information between the collision point information to be processed and the current target detection point.
  • the distance information between the current target detection point and the second sub-model in each direction is determined.
  • That is, the distance information or set value corresponding to each physical ray emitted by the current target detection point can be determined, and the distance information or set value is used as the distance information between the current target detection point and the second sub-model in each direction.
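  • The sampling procedure described in this embodiment could be sketched as follows, assuming a hypothetical engine routine cast_ray(origin, direction) that returns the collision point on the second sub-model or None, a 16 × 32 direction grid as the default sampling precision, and a preset maximum distance as the set value; all names are illustrative:

```python
import numpy as np

MAX_DISTANCE = 100.0   # assumed "set value" for rays that never hit the second sub-model

def sample_sphere_directions(n_theta=16, n_phi=32):
    """Build the 16 x 32 grid of sampling directions mentioned above (512 rays)."""
    dirs = []
    for i in range(n_theta):
        theta = np.pi * (i + 0.5) / n_theta
        for j in range(n_phi):
            phi = 2.0 * np.pi * j / n_phi
            dirs.append((np.sin(theta) * np.cos(phi),
                         np.sin(theta) * np.sin(phi),
                         np.cos(theta)))
    return np.array(dirs)

def sample_distances(detection_point, cast_ray):
    """Hypothetical sketch: cast a physical ray from the target detection point
    in every sampled direction; store the distance to the to-be-processed
    collision point on the second sub-model, or the set value if there is none.
    `cast_ray(origin, direction)` is an assumed engine routine returning the
    collision point or None."""
    origin = np.asarray(detection_point, dtype=float)
    directions = sample_sphere_directions()
    distances = np.empty(len(directions))
    for k, d in enumerate(directions):
        hit = cast_ray(origin, np.asarray(d))
        distances[k] = MAX_DISTANCE if hit is None else np.linalg.norm(np.asarray(hit) - origin)
    return directions, distances
```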
  • the distance function of the spherical distribution of the target detection point can be obtained.
  • the number of sub-functions in this distance function is the same as the amount of distance information of the target detection point in each direction of the sphere.
  • the number of sub-functions can be increased, that is, the density of physical rays can be increased, and the specific number of physical rays can be determined according to actual requirements.
  • the number of basis functions contained in spherical harmonics of different orders is different.
  • the second-order spherical harmonic function contains 4 basis functions
  • the third-order spherical harmonic function contains 9 basis functions
  • the fourth-order spherical harmonic function contains 16 basis functions, and so on.
  • The higher the order of the spherical harmonic function, the better the effect of subsequent reconstruction using the reconstruction function.
  • the specific order needs to be set according to actual needs.
  • If the order of the spherical harmonic function is determined to be a according to the requirements, then the number of basis functions in the spherical harmonic function can be determined to be a².
  • the representation of each basis function can be determined according to the relationship between the distance function and the value of the projection coefficient.
  • the same method is used to determine the projection coefficient values of different target detection points.
  • the projection coefficient value of one of the target detection points is determined as an example.
  • the projection coefficient value is a value determined by calculating the distance function using each basis function of the preset spherical harmonic functions.
  • the number of projection coefficient values is the same as the number of basis functions.
  • the distance functions of the spherical distribution of each target detection point are different, and after inputting different distance functions into the preset spherical harmonic function for processing, the obtained projection coefficient values are different.
  • S460: Correspondingly store the projection coefficient values of each target detection point in at least one image according to the spatial position relationship, and store the at least one image in the target position, so as to obtain from the target position the projection coefficient values of all target detection points on the corresponding-order spherical harmonic basis functions.
  • the target position can be stored in the engine.
  • the engine can be the core component of a programmed editable computer game system or some interactive real-time graphics application.
  • the target location may be a storage space used to store data and/or information in the engine, and in this embodiment, is a storage space used to store coordinate information of the target detection point.
  • multiple images corresponding to the number of coefficient values may be acquired.
  • According to the spatial coordinate information of the current target detection point, the storage position of that spatial coordinate in the image is determined, and the projection coefficient value of the current target detection point is stored at the corresponding storage position in the image, so that the stored projection coefficient values can be obtained from the image at the target position to reconstruct the reconstruction function. That is, if the projection coefficient values corresponding to a certain target detection point need to be used, the projection coefficient values corresponding to the spatial coordinate information can be determined according to the target position of the spatial coordinate information of that target detection point in the engine, for subsequent use in the reconstruction function.
  • S480: Determine the target reconstruction function corresponding to the current target detection point, and input the relative shooting angle information corresponding to the current target detection point into the target reconstruction function, so as to obtain the distance information between the first sub-model and the second sub-model of the target detection point under the current shooting angle information.
  • the advantage of compressing the distance function based on the spherical harmonic function to obtain the projection coefficient value is:
  • the distance function corresponding to each target detection point can be one or more, in which case the amount of data to be stored is relatively large. In order to reduce the occupancy of the storage space, the distance function can be compressed into several projection coefficient values.
  • In the technical solution of this embodiment, the distance function is processed based on the preset spherical harmonic function, the projection coefficients are determined, and the projection coefficients are stored in the vertex color and/or attribute information; the transparency parameter at the target shooting angle is then determined and displayed according to the stored information and the reconstruction function, which solves the problem of poor transparent display effect caused by the deviation between the transparent display and the actual situation when the first sub-model is transparently displayed with a fixed transparency display value.
  • FIG. 5 is a schematic structural diagram of an apparatus for determining transparency according to an embodiment of the present application.
  • the apparatus includes: a relative shooting angle information determination module 510 , a distance information determination module 520 , a transparency information determination module 530 and a transparent display module 540 .
  • The relative shooting angle information determination module 510 is configured to determine the relative shooting angle information corresponding to the shooting device and each target detection point on the first sub-model. The distance information determination module 520 is configured to, for each target detection point, determine the target reconstruction function corresponding to the current target detection point, and input the relative shooting angle information corresponding to the current target detection point into the target reconstruction function to obtain the distance information between the first sub-model and the second sub-model of the target detection point under the current shooting angle information.
  • The transparency information determination module 530 is configured to, when it is detected that the first sub-model is displayed translucently, determine the transparency information of the to-be-collided point corresponding to each target detection point according to the distance information corresponding to each target detection point; the to-be-collided point is the collision point corresponding to the target detection point on the second sub-model. The transparent display module 540 is configured to adjust the attribute information of each to-be-collided point based on the transparency information of each to-be-collided point, so that each to-be-collided point on the second sub-model is displayed based on the attribute information.
  • the device further includes: a target reconstruction function determination module, configured to determine the target reconstruction function corresponding to each target detection point; the target reconstruction function is configured to process the input current shooting angle to obtain the distance information between the first sub-model and the second sub-model to which the target detection point corresponding to the current shooting angle belongs.
  • Before the target reconstruction function determination module determines the target reconstruction function corresponding to each target detection point according to the pre-stored projection coefficient values of each target detection point on each basis function in the spherical harmonic function, the module is further configured to: determine the projection coefficient value of each target detection point on each basis function in the spherical harmonic function. Determining the projection coefficient value of each target detection point on each basis function includes: for each target detection point on the first sub-model, determining the to-be-processed collision point information when the current target detection point emits physical rays in all directions and the rays pass through the second sub-model; determining the distance values of the current target detection point in each direction according to the current target detection point and the information of each collision point to be processed; and determining the distance function corresponding to each target detection point according to the distance values of each target detection point in each direction;
  • the distance function of each target detection point is processed based on the spherical harmonic function, and the projection coefficient value of each target detection point on each basis function in the spherical harmonic function is obtained;
  • the spherical harmonic function includes a plurality of basis functions.
  • The distance information determination module, when determining the collision point information to be processed when the current target detection point emits physical rays in all directions and the rays pass through the second sub-model, and determining the distance values of the current target collision in all directions according to the current target detection point and the information of each collision point to be processed, is further configured to: take the current target detection point as the center of the sphere, emit physical rays in any direction in space, record the collision point to be processed when each physical ray intersects the second sub-model, and determine the distance value between each collision point to be processed and the target detection point; if there is no collision point to be processed between a physical ray and the second sub-model, mark the distance value corresponding to that physical ray as a set value; and determine the distance values of the current target detection point in each direction based on the distance values and set values corresponding to the physical rays.
  • the device further includes: a projection coefficient value determination module, configured to: determine the order of the spherical harmonic function, and determine the representation and the number of the basis functions according to the order; and, for each target detection point, process the distance function of the current target detection point based on each basis function to obtain the projection coefficient values of the current target detection point in the direction of each basis function; the number of projection coefficient values is the same as the number of basis functions.
  • the device further includes: a projection coefficient value storage module, configured to store the projection coefficient values of each target detection point in at least one image according to the spatial position relationship, and store the at least one image in the target position, so as to obtain from the target position the projection coefficient values of all target detection points on the corresponding-order spherical harmonic basis functions.
  • the target reconstruction function determination module is further configured to: retrieve the at least one image from the target position, and determine the target pixel point position of the current collision point in the at least one image; retrieve the stored projection coefficient values from the target pixel point position in the at least one image respectively; and determine the target reconstruction function of the corresponding target detection point based on the projection coefficient values and the spherical harmonic function.
  • the distance information determination module is further configured to: determine the relative shooting angle information corresponding to the current target detection point and the shooting device; and input the relative shooting angle information into the target reconstruction function to determine the distance information corresponding to the current collision point under the relative shooting angle, where the distance information is the distance between the target detection point and the collision point on the second sub-model when the line to which the target detection point and the shooting device belong collides with the second sub-model.
  • the transparency display module is further configured to: when it is detected that the second sub-model is displayed translucently, determine the transparency information of the to-be-collided point corresponding to each target detection point according to the distance information corresponding to each target detection point and the pre-established correspondence between distance information and transparency information.
  • the transparency display module is further configured to adjust the color depth value of each to-be-collided point according to the transparency information of each to-be-collided point, so that transparent display is performed based on the color depth values of the to-be-collided points on the second sub-model.
  • in the above solution, the distance function is processed based on the preset spherical harmonic function to determine the projection coefficients, which are stored in the vertex color and/or attribute information; the transparency parameter at the target shooting angle is then determined from the stored information and the reconstruction function and used for display, which solves the problem of the poor transparent display effect caused by the deviation between the transparent display and the actual situation when the first sub-model is transparently displayed with a fixed transparency display value.
  • the apparatus for determining transparency provided by the embodiment of the present application can execute the method for determining transparency provided by any embodiment of the present application, and has corresponding functional modules and beneficial effects for executing the method.
  • FIG. 6 is a schematic structural diagram of an electronic device according to an embodiment of the present application, and FIG. 6 shows a block diagram of an exemplary electronic device 60 suitable for implementing the embodiments of the present application.
  • the electronic device 60 shown in FIG. 6 is only an example, and should not impose any limitations on the functions and scope of use of the embodiments of the present application.
  • electronic device 60 takes the form of a general-purpose computing device.
  • Components of electronic device 60 may include, but are not limited to, one or more processors or processing units 601, system memory 602, and a bus 603 connecting different system components (including system memory 602 and processing unit 601).
  • Bus 603 represents one or more of several types of bus structures, including a memory bus or memory controller, a peripheral bus, an accelerated graphics port, a processor, or a local bus using any of a variety of bus architectures.
  • These architectures include, but are not limited to, the Industry Standard Architecture (ISA) bus, the Micro Channel Architecture (MCA) bus, the Enhanced ISA bus, the Video Electronics Standards Association (VESA) local bus, and the Peripheral Component Interconnect (PCI) bus.
  • Electronic device 60 typically includes a variety of computer system readable media. These media can be any available media that can be accessed by electronic device 60, including both volatile and non-volatile media, removable and non-removable media.
  • System memory 602 may include computer system readable media in the form of volatile memory, such as random access memory (RAM) 604 and/or cache memory 605 .
  • Electronic device 60 may further include other removable/non-removable, volatile/non-volatile computer system storage media.
  • storage system 606 may be configured to read and write to non-removable, non-volatile magnetic media (not shown in Figure 6, commonly referred to as a "hard drive”).
  • Although not shown in FIG. 6, a magnetic disk drive configured to read from and write to removable non-volatile magnetic disks (e.g., "floppy disks"), and an optical disk drive configured to read from and write to removable non-volatile optical disks (e.g., CD-ROM, DVD-ROM, or other optical media), may also be provided.
  • each drive may be connected to bus 603 through one or more data media interfaces.
  • System memory 602 may include at least one program product having a set (eg, at least one) of program modules configured to perform the functions of each embodiment of the present application.
  • Program modules 607 generally perform the functions and/or methods of the embodiments described herein.
  • the electronic device 60 may also communicate with one or more external devices 609 (e.g., keyboards, pointing devices, display 610, etc.), with one or more devices that enable a user to interact with the electronic device 60, and/or with any device (e.g., network card, modem, etc.) that enables the electronic device 60 to communicate with one or more other computing devices. Such communication may take place through an input/output (I/O) interface 611. Also, the electronic device 60 may communicate with one or more networks (e.g., a Local Area Network (LAN), a Wide Area Network (WAN), and/or a public network such as the Internet) through a network adapter 612. As shown, the network adapter 612 communicates with the other modules of the electronic device 60 via the bus 603. It should be understood that, although not shown in FIG. 6, other hardware and/or software modules may be used in conjunction with the electronic device 60, including but not limited to: microcode, device drivers, redundant processing units, external disk drive arrays, Redundant Arrays of Independent Disks (RAID) systems, tape drives, and data backup storage systems.
  • the processing unit 601 executes various functional applications and data processing by running the programs stored in the system memory 602, for example, implementing the method for determining transparency provided by the embodiments of the present application.
  • An embodiment of the present application further provides a storage medium containing computer-executable instructions, the computer-executable instructions, when executed by a computer processor, are configured to perform a method of determining transparency.
  • the method includes:
  • determining relative shooting angle information corresponding to the shooting device and each target detection point on the first sub-model;
  • for each target detection point, determining the target reconstruction function corresponding to the current target detection point, and inputting the relative shooting angle information corresponding to the current target detection point into the target reconstruction function to obtain the distance information between the first sub-model and the second sub-model at the target detection point under the current shooting angle information;
  • when it is detected that the first sub-model is displayed semi-transparently, determining, according to the distance information corresponding to each target detection point, the transparency information of the to-be-collided point corresponding to each target detection point; the to-be-collided point is the collision point on the second sub-model corresponding to the target detection point;
  • adjusting, based on the transparency information of each to-be-collided point, the attribute information of each to-be-collided point, so that each to-be-collided point on the second sub-model is transparently displayed based on the attribute information.
  • the computer storage medium of the embodiments of the present application may adopt any combination of one or more computer-readable media.
  • the computer-readable medium may be a computer-readable signal medium or a computer-readable storage medium.
  • the computer-readable storage medium can be, for example, but not limited to, an electrical, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus or device, or a combination of any of the above.
  • Examples (a non-exhaustive list) of computer-readable storage media include: an electrical connection with one or more wires, a portable computer disk, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the above.
  • a computer-readable storage medium can be any tangible medium that contains or stores a program that can be used by or in conjunction with an instruction execution system, apparatus, or device.
  • a computer-readable signal medium may include a propagated data signal in baseband or as part of a carrier wave, with computer-readable program code embodied thereon. Such propagated data signals may take a variety of forms, including but not limited to electromagnetic signals, optical signals, or any suitable combination of the foregoing.
  • a computer-readable signal medium can also be any computer-readable medium other than a computer-readable storage medium, which can transmit, propagate or transport a program arranged for use by or in connection with the instruction execution system, apparatus or device .
  • Program code embodied on a computer readable medium may be transmitted using any suitable medium, including but not limited to wireless, wire, optical fiber cable, radio frequency (RF), etc., or any suitable combination of the foregoing.
  • Computer program code configured to perform the operations of the embodiments of the present application may be written in one or more programming languages, or combinations thereof, including object-oriented programming languages such as Java, Smalltalk, and C++, as well as conventional procedural programming languages such as the "C" language or similar programming languages.
  • the program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server.
  • the remote computer may be connected to the user's computer through any kind of network, including a local area network (LAN) or a wide area network (WAN), or may be connected to an external computer (for example, through the Internet using an Internet service provider).

Abstract

Provided in the embodiments of the present application are a transparency determination method and apparatus, and an electronic device and a storage medium. The method comprises: determining relative photographic angle information, corresponding to each target detection point on a first sub-model, of a photographic apparatus; determining a target reconstruction function corresponding to the current target detection point, and inputting the relative photographic angle information corresponding to the current target detection point into the target reconstruction function, so as to obtain distance information of the target detection point under the current photographic angle information; when it is detected that the first sub-model is in a semi-transparent display mode, determining, according to distance information corresponding to each target detection point, transparency information of a point to be subjected to a collision corresponding to each target detection point; and adjusting attribute information of each said point on the basis of the transparency information of each said point, such that each said point on a second sub-model is subjected to transparent display on the basis of the attribute information.

Description

Method, apparatus, electronic device and storage medium for determining transparency
This application claims priority to the Chinese patent application with application number 202011445988.4, filed with the China Patent Office on December 8, 2020, the entire contents of which are incorporated herein by reference.
Technical Field
The embodiments of the present application relate to the technical field of games, and relate, for example, to a method, apparatus, electronic device, and storage medium for determining transparency.
Background
In animation design, a translucent effect is usually set between an inner model and an outer model, for example, the translucent display of a skin model and a clothes model. Such translucent display mainly depends on the effect produced when light reflected by the inner object travels a certain distance, penetrates the outer object, and enters the human eye. Each model is composed of individual points: the farther a point on the inner object is from its corresponding point on the outer object, the weaker the translucent effect; conversely, the closer it is, the stronger the translucent effect.
In the related art, the translucent effect is determined mainly by setting a transparency display value for the outer model. This transparency display value is usually fixed, and all points on the outer model are displayed transparently according to the set value. The transparent display effect produced in this way deviates to some extent from the actual situation, so the transparent display effect is poor and the user experience suffers.
Summary of the Invention
Embodiments of the present application provide a method, apparatus, electronic device, and storage medium for determining transparency, so that the transparent display effect is consistent with reality and the user experience is improved.
In a first aspect, an embodiment of the present application provides a method for determining transparency, the method comprising:
determining relative shooting angle information corresponding to the shooting device and each target detection point on a first sub-model;
for each target detection point, determining a target reconstruction function corresponding to the current target detection point, and inputting the relative shooting angle information corresponding to the current target detection point into the target reconstruction function to obtain distance information between the first sub-model and a second sub-model at the target detection point under the current shooting angle information; the first sub-model is a sub-model that partially wraps the second sub-model; the reconstruction function is constructed based on the projection coefficient values of the target detection point on each basis function of a spherical harmonic function;
when it is detected that the first sub-model is displayed semi-transparently, determining, according to the distance information corresponding to each target detection point, transparency information of the to-be-collided point corresponding to each target detection point; the to-be-collided point is the collision point on the second sub-model corresponding to the target detection point;
adjusting the attribute information of each to-be-collided point based on the transparency information of each to-be-collided point, so that each to-be-collided point on the second sub-model is displayed based on the attribute information.
In a second aspect, an embodiment of the present application further provides an apparatus for determining transparency, comprising:
a relative shooting angle information determination module, configured to determine relative shooting angle information corresponding to the shooting device and each target detection point on the first sub-model;
a distance information determination module, configured to, for each target detection point, determine a target reconstruction function corresponding to the current target detection point, and input the relative shooting angle information corresponding to the current target detection point into the target reconstruction function to obtain distance information between the first sub-model and the second sub-model at the target detection point under the current shooting angle information; the first sub-model is a sub-model that partially wraps the second sub-model; the reconstruction function is constructed based on the projection coefficient values of the target detection point on each basis function of the spherical harmonic function;
a transparency information determination module, configured to, when it is detected that the first sub-model is displayed semi-transparently, determine, according to the distance information corresponding to each target detection point, transparency information of the to-be-collided point corresponding to each target detection point; the to-be-collided point is the collision point on the second sub-model corresponding to the target detection point;
a transparent display module, configured to adjust the attribute information of each to-be-collided point based on the transparency information of each to-be-collided point, so that each to-be-collided point on the second sub-model is displayed based on the attribute information.
In a third aspect, an embodiment of the present application further provides an electronic device, the electronic device comprising:
one or more processors;
a storage device configured to store one or more programs;
wherein, when the one or more programs are executed by the one or more processors, the one or more processors implement the method for determining transparency described in any one of the embodiments of the present application.
In a fourth aspect, embodiments of the present application further provide a storage medium containing computer-executable instructions which, when executed by a computer processor, are configured to perform the method for determining transparency described in any one of the embodiments of the present application.
Brief Description of the Drawings
FIG. 1 is a schematic flowchart of a method for determining transparency according to an embodiment of the present application;
FIG. 2 is a schematic flowchart of a method for determining transparency according to another embodiment of the present application;
FIG. 3 is a schematic flowchart of a method for determining transparency according to another embodiment of the present application;
FIG. 4 is a schematic flowchart of a method for determining transparency according to another embodiment of the present application;
FIG. 5 is a schematic structural diagram of an apparatus for determining transparency according to an embodiment of the present application;
FIG. 6 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
Detailed Description
The present application is further described in detail below with reference to the accompanying drawings and embodiments. It should be understood that the embodiments described herein are only used to explain the present application, not to limit it. In addition, it should be noted that, for ease of description, the drawings show only the parts related to the present application rather than the entire structure.
FIG. 1 is a schematic flowchart of a method for determining transparency according to an embodiment of the present application. This embodiment is applicable to the case in which the target reconstruction function corresponding to each target detection point is determined from pre-stored projection coefficient values, the distance information between the first sub-model and the second sub-model corresponding to each target detection point at the corresponding viewing angle is determined from the target reconstruction function of that target detection point, and transparent display is then performed based on the distance information. The method may be executed by an apparatus for determining transparency, which may be implemented in the form of software and/or hardware; the hardware may be an electronic device, for example, a mobile terminal.
S110. Determine relative shooting angle information corresponding to the shooting device and each target detection point on the first sub-model.
The first sub-model and the second sub-model are relative terms. If the application scene is a game scene, the sub-models may be a skin model and a clothes model: the model corresponding to the clothes may serve as the first sub-model and the model corresponding to the skin as the second sub-model. A target detection point may be a detection point preset on the first sub-model; alternatively, the preset points on the first sub-model may be divided into a plurality of blocks, and the center point of each block used as a target detection point; the detection points may also be set by the developer according to actual needs; of course, since the first sub-model is composed of a plurality of points, each point may also be used as a target detection point. The shooting device is a device for observing and shooting the first sub-model, and the relative shooting angle is the relative angle between the shooting device and each target detection point on the first sub-model. The relative shooting angle information is determined from the positional information between the shooting device and each target detection point.
For example, the shooting angle between the shooting device and each target detection point differs. According to the relative positional relationship between the shooting device and each target detection point on the first sub-model, the relative angle information between the shooting device and each target detection point can be determined and used as the relative shooting angle information.
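For illustration only, the following Python sketch shows one way in which the relative shooting angle information could be derived from the positions of the shooting device and a target detection point; the use of a surface normal at the detection point, and all names in the sketch, are assumptions made for this example and are not features prescribed by the embodiments.

    import numpy as np

    def relative_shooting_angle(camera_pos, point_pos, point_normal):
        """Return the unit view direction and the angle (in degrees) between the
        detection point's (assumed) surface normal and the direction towards the camera."""
        view_dir = camera_pos - point_pos
        view_dir = view_dir / np.linalg.norm(view_dir)
        normal = point_normal / np.linalg.norm(point_normal)
        cos_angle = np.clip(np.dot(view_dir, normal), -1.0, 1.0)
        return view_dir, float(np.degrees(np.arccos(cos_angle)))

    # Example usage with assumed positions.
    view_dir, angle = relative_shooting_angle(
        np.array([0.0, 1.0, 5.0]),   # shooting device (camera) position
        np.array([0.0, 0.0, 0.0]),   # target detection point on the first sub-model
        np.array([0.0, 0.0, 1.0]))   # assumed surface normal at the detection point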
S120. For each target detection point, determine the target reconstruction function corresponding to the current target detection point, and input the relative shooting angle information corresponding to the current target detection point into the target reconstruction function to obtain the distance information between the first sub-model and the second sub-model at the target detection point under the current shooting angle information.
The first sub-model is a sub-model that partially wraps the second sub-model; the reconstruction function is constructed based on the projection coefficient values of the target detection point on each basis function of the spherical harmonic function. Each target detection point has a corresponding target reconstruction function.
In order to introduce the technical solution of this embodiment clearly, the reconstruction of the reconstruction function of one target detection point is taken as an example.
The reconstruction function of a target detection point is constructed from the distance values obtained when, after the projection coefficients of the current target detection point are processed according to the spherical harmonic function, the current target detection point emits physical rays in every direction in space and the rays collide with the second sub-model. The reconstruction function can be used to process an input relative shooting angle, so as to determine the distance information between the first sub-model and the second sub-model corresponding to the target detection point at that relative shooting angle.
For example, by processing the projection coefficient values corresponding to the current target detection point based on each basis function of the preset spherical harmonic function, the distance information between the current target detection point and the collision point at which each physical ray emitted from the current target detection point in every direction in space collides with the second sub-model can be obtained. According to the distance information in every spatial direction corresponding to the current target detection point, the reconstruction function corresponding to the current target detection point is constructed.
After the target shooting angle between the current target detection point and the target shooting device is determined, the target shooting angle can be input into the reconstruction function corresponding to the current target detection point. The reconstruction function processes the target shooting angle and outputs the distance information between the collision point and the current target detection point when, at that target shooting angle, the line through the current target detection point and the target shooting device collides with the second sub-model.
Exemplarily, after the reconstruction function of a certain target detection point A has been determined, and the relative shooting angle between this target detection point and the shooting device is determined to be 45°, the relative shooting angle can be input into the reconstruction function of target detection point A to obtain the distance information between the first sub-model and the second sub-model corresponding to target detection point A at that angle; for example, the obtained distance information is 5 nm.
It should be noted that the relative shooting angle between each target detection point and the shooting device differs to some extent; the relative shooting angle between each target detection point and the shooting device can be determined and used as the relative shooting angle of that target detection point.
S130. When it is detected that the first sub-model is displayed semi-transparently, determine, according to the distance information corresponding to each target detection point, the transparency information of the to-be-collided point corresponding to each target detection point.
Semi-transparent display can be understood as meaning that the material used by the first sub-model is a semi-transparent material. The distance information corresponding to each target detection point may be the same or different; it refers to the distance between the target detection point and the to-be-collided point when the line through the shooting device and the target detection point collides with the second sub-model. The transparency information indicates the degree of transparency when the model is displayed and can be expressed as a percentage, for example, a transparency of 80%.
It should be noted that the transparency parameters of different target detection points are determined in the same way. In order to introduce the technical solution of this embodiment clearly, the determination of the transparency parameter of one target detection point is taken as an example.
For example, determining the transparency parameter of the current target detection point according to the distance information between the first sub-model and the second sub-model corresponding to the current target detection point may be: determining the transparency parameter from a pre-stored correspondence between distance information and transparency parameters according to the distance information; or calculating it with a preset transparency-parameter calculation model, by inputting the distance information into the calculation model and obtaining the transparency parameter corresponding to that distance information.
Exemplarily, the distance information is denoted dist. If 0 nm < dist ≤ 5 nm, the transparency parameter is 90%; if 5 nm < dist ≤ 10 nm, the transparency parameter is 80%; when dist = 7.5 nm, the corresponding transparency parameter is therefore 80%.
Of course, a corresponding calculation method can also be used to determine the transparency parameter, where the transparency calculation formula is a_i = f(l_i), in which a_i represents the transparency parameter of the i-th target detection point, l_i represents the distance information between the first sub-model and the second sub-model corresponding to the i-th target detection point, and f is a monotonic, monotonically decreasing function.
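The sketch below is a non-limiting illustration of the correspondence a_i = f(l_i): one possible monotonically decreasing function f, together with a piecewise lookup mirroring the example thresholds given above. The exponential fall-off rate and the value used for larger distances are assumptions for this example only.

    import math

    def transparency_from_distance(dist):
        """A monotonically decreasing mapping from distance to a transparency value in (0, 1]."""
        if dist <= 0.0:
            return 1.0
        return math.exp(-0.15 * dist)   # assumed fall-off rate

    def transparency_from_table(dist):
        """Piecewise lookup mirroring the example thresholds above."""
        if dist <= 5.0:
            return 0.90
        if dist <= 10.0:
            return 0.80
        return 0.50                     # assumed value for larger distances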
After the transparency parameter of a target detection point has been determined, in order to improve the visual experience of the first sub-model and the second sub-model, the target detection point on the first sub-model may be displayed according to its corresponding transparency parameter, and the intersection of the line through the target detection point and the target shooting device with the second sub-model may be displayed according to the corresponding transparency parameter, so as to obtain the effect of transparent display.
By determining the transparency parameter of each target detection point, the effect of transparently displaying the first sub-model and the second sub-model can be realized.
For example, according to the distance information corresponding to each target detection point, the transparency parameter corresponding to the distance information can be determined from the pre-stored correspondence between each piece of distance information and a transparency parameter; this may include the transparency parameter of the first sub-model and the transparency parameter of the second sub-model, for use in subsequent transparent display. After the transparency parameters for displaying the first sub-model and the second sub-model corresponding to each target detection point at the target shooting angle have been determined, the attribute information of the to-be-collided points corresponding to the target detection points on the first sub-model can be adjusted based on these transparency parameters.
For example, when it is detected that the first sub-model is displayed semi-transparently, determining, according to the distance information corresponding to each target detection point, the transparency information of the to-be-collided point corresponding to each target detection point includes: when it is detected that the second sub-model is displayed semi-transparently, determining the transparency information of the to-be-collided point corresponding to each target detection point according to the distance information corresponding to each target detection point and the pre-established correspondence between distance information and transparency information.
A correspondence between distance information and transparency parameters can be established in advance, and the transparency parameter of each target detection point can be determined according to this correspondence. For example, when the distance information is 5 mm, the transparency parameter determined from the correspondence is 0.5; when the distance information is 10 mm, the transparency parameter determined from the correspondence is 0.2.
That is to say, after the distance information is determined, the transparency information corresponding to each target detection point can be determined according to the correspondence between distance information and transparency parameters.
S140. Based on the transparency information of each to-be-collided point, adjust the attribute information of each to-be-collided point, so that each to-be-collided point on the second sub-model is displayed transparently based on the attribute information.
The to-be-collided point is a point on the second sub-model. For each target detection point, after the line connecting the current target detection point and the shooting device is determined, the to-be-collided point can be obtained as the point at which this line collides with the second sub-model. That is, the to-be-collided point is the point on the second sub-model corresponding to each target detection point at that target detection point's relative shooting angle. The attribute information may be a color attribute of the to-be-collided point; for example, the smaller the distance between the to-be-collided point and the corresponding target detection point, the deeper the color depth of the to-be-collided point and the better the transparent display effect; the larger the distance, the lighter the color depth of the to-be-collided point and the relatively weaker the transparency effect, which is consistent with the actual theory.
For example, after the transparency information of each target detection point is determined, the color attribute of the collision point corresponding to the target detection point can be adjusted according to the transparency information; for example, the smaller the distance information, the deeper the color of the to-be-collided point, and conversely the lighter. In this way, when the outer layer is displayed semi-transparently, the color depth of the to-be-collided point corresponding to each target detection point can be adjusted, thereby realizing transparent display. In the technical solution of this embodiment, the distance information between the first sub-model and the second sub-model corresponding to a target detection point is obtained from the determined target shooting angle and from the projection coefficient values retrieved from the vertex color and/or attribute information, and the transparency parameter is then determined and used for display. This solves the technical problem that, when the first sub-model is displayed transparently with a fixed transparency display value, the transparent display deviates from the actual situation, resulting in a poor transparent display effect and a poor user experience; the transparency parameter is adjusted according to the actual situation, so that the transparent display effect is consistent with the actual theoretical effect and the user experience is improved.
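A minimal sketch of this color-depth adjustment is given below; the data structures (dictionaries keyed by a point identifier) and the multiplicative adjustment are assumptions chosen only to illustrate that a higher transparency parameter (a closer inner/outer distance) keeps a deeper color.

    def adjust_collision_points(base_color_depth, transparency):
        """base_color_depth: dict point_id -> base color depth value in [0, 1]
        transparency:     dict point_id -> transparency parameter in [0, 1]"""
        adjusted = {}
        for point_id, depth in base_color_depth.items():
            t = transparency.get(point_id, 0.0)
            # A closer inner/outer distance gives a higher transparency parameter
            # and therefore a deeper (larger) color depth value.
            adjusted[point_id] = depth * t
        return adjusted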
On the basis of the above technical solution, adjusting the attribute information of each to-be-collided point based on the transparency information of each to-be-collided point, so that each to-be-collided point on the second sub-model is displayed based on the attribute information, includes: adjusting the color depth value of each to-be-collided point according to the transparency information of each to-be-collided point, so as to perform transparent display based on the color depth values of the to-be-collided points on the second sub-model. The attribute information may be a color depth value. Since the first sub-model is made of a semi-transparent material, the color depth of the collision point corresponding to each target detection point can be adjusted according to the transparency parameter corresponding to that target detection point; for example, the closer the distance, the larger the color depth value, and conversely the lighter the color of the to-be-collided point.
It should be noted that the color attributes, that is, the color depth values, corresponding to different transparency parameters can be set according to the actual situation. After a transparency parameter has been determined, the color depth value of each to-be-collided point can be adjusted according to the color depth value corresponding to that transparency parameter.
For example, after the distance information between the first sub-model and the second sub-model corresponding to each target detection point has been determined, the attribute information of the first sub-model and the second sub-model can be obtained, and the transparency information corresponding to each target detection point can be determined by combining the attribute information and the distance information, so that the first sub-model and the second sub-model are displayed transparently.
FIG. 2 is a schematic flowchart of a method for determining transparency according to another embodiment of the present application. On the basis of the above embodiment, the reconstruction function can be rebuilt from the spherical harmonic function and the projection coefficient values, the distance information corresponding to each target detection point at its relative shooting angle with respect to the shooting device is determined based on the reconstruction function, and the transparency parameter corresponding to each target detection point is then determined. For the specific implementation, refer to the following detailed description. Explanations of terms that are the same as or correspond to those in the above embodiments are not repeated here.
Referring to FIG. 2, the method for determining transparency provided by this embodiment includes:
S210. Determine the target reconstruction function corresponding to each target detection point according to the pre-received projection coefficient value of each target detection point on each basis function of the spherical harmonic function.
The target reconstruction function is used to process the input current shooting angle to obtain the distance information between the first sub-model, to which the target detection point corresponding to the current shooting angle belongs, and the second sub-model. Each target detection point has a corresponding target reconstruction function. When rays are emitted from each target detection point, physical rays can be emitted in every direction in space, and the distance information between each physical ray and the first and second sub-models can be determined; the spherical harmonic function processes each piece of distance information, so that all the distances are compressed into a few function values, and these function values can be used as the projection coefficient values. By processing the projection coefficient values corresponding to each target detection point with the spherical harmonic function, a spherical function corresponding to the target detection point can be reconstructed, so that a relative shooting angle can be input into this spherical function, that is, the reconstruction function, to obtain the distance information at that angle.
For example, the pre-stored projection coefficient values corresponding to each target detection point can be obtained and processed based on the spherical harmonic function to reconstruct the target reconstruction function corresponding to each target detection point, which is a function from which the distance information between the first sub-model and the second sub-model at each relative shooting angle can be determined.
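As a non-limiting sketch, and assuming the four-basis-function ("second-order") case described later in this application, the code below evaluates a stored set of projection coefficients for a given relative shooting direction to recover a distance value; the basis constants are the standard real spherical-harmonic normalization factors.

    import numpy as np

    def sh_basis_4(direction):
        """Constant term plus the three first-order real spherical-harmonic basis functions."""
        x, y, z = direction / np.linalg.norm(direction)
        return np.array([
            0.2820948,        # Y_0^0
            0.4886025 * y,    # Y_1^-1
            0.4886025 * z,    # Y_1^0
            0.4886025 * x,    # Y_1^1
        ])

    def reconstruct_distance(coefficients, direction):
        """Target reconstruction function: the distance for this detection point
        in the given relative shooting direction."""
        return float(np.dot(coefficients, sh_basis_4(direction)))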
S220. Determine relative shooting angle information corresponding to the shooting device and each target detection point on the first sub-model.
S230. For each target detection point, determine the target reconstruction function corresponding to the current target detection point, and input the relative shooting angle information corresponding to the current target detection point into the target reconstruction function to obtain the distance information between the first sub-model and the second sub-model at the target detection point under the current shooting angle information.
S240. When it is detected that the first sub-model is displayed semi-transparently, determine, according to the distance information corresponding to each target detection point, the transparency information of the to-be-collided point corresponding to each target detection point.
S250. Based on the transparency information of each to-be-collided point, adjust the attribute information of each to-be-collided point, so that each to-be-collided point on the second sub-model is displayed transparently based on the attribute information.
In the technical solution of this embodiment of the present application, the projection coefficient values corresponding to each target detection point can be reconstructed according to the spherical harmonic function to obtain the reconstruction function of each target detection point. When the relative shooting angle corresponding to each target detection point is input into the corresponding reconstruction function, the distance information between the first sub-model and the second sub-model corresponding to that target detection point is obtained; the transparency parameter information is then determined in combination with the distance information, the color attribute corresponding to each target detection point is determined according to the distance information, and the color attribute of each to-be-collided point on the second sub-model is adjusted dynamically so that the model is displayed transparently.
FIG. 3 is a schematic flowchart of a method for determining transparency according to another embodiment of the present application. Before the target reconstruction function corresponding to each target detection point is determined according to the pre-received projection coefficient value of each target detection point on each basis function of the spherical harmonic function, the method further includes determining the projection coefficient value of each target detection point. For the specific implementation, refer to the following description; technical terms that are the same as or correspond to those in the above embodiments are not repeated here.
As shown in FIG. 3, the method includes:
S310. Determine the projection coefficient value of each target detection point on each basis function of the spherical harmonic function.
In this embodiment, determining the projection coefficient value of each target detection point may be: for each target detection point on the first sub-model, determining the to-be-processed collision point information obtained when the current target detection point emits physical rays in every direction through the second sub-model, and determining, according to the current target detection point and the information of each to-be-processed collision point, the distance value of the current target detection point in each direction; determining, according to the distance values of each target detection point in each direction, the distance function corresponding to each target detection point; and processing the distance function of each target detection point based on the spherical harmonic function to obtain the projection coefficient value of each target detection point on each basis function of the spherical harmonic function, the spherical harmonic function comprising a plurality of basis functions.
The first sub-model and the second sub-model are relative terms; if the application scene involves a skin model and a clothes model, the model corresponding to the clothes may serve as the first sub-model and the model corresponding to the skin as the second sub-model. A target detection point may be a detection point preset on the first sub-model; alternatively, the preset points on the first sub-model may be divided into a plurality of blocks, and the center point of each block used as a target detection point; the detection points may also be set by the developer according to actual needs; of course, since the first sub-model is composed of a plurality of points, each point may also be used as a target detection point. The distance function corresponds to each target detection point, that is, one target detection point corresponds to one distance function. Each distance function can be understood as being determined from the relative distance information between a certain target detection point and the second sub-model in every direction in space.
Since the distance function is spherically distributed, the distance between each target detection point and the second sub-model in each direction may be obtained as follows: with the target detection point as the center of a sphere, physical rays are emitted in every direction in space, and the distance information between the intersection of each physical ray with the second sub-model and the target detection point is determined.
Exemplarily, if there are 1000 target detection points, the number of distance functions is also 1000, and each distance function may be determined from the distances between a certain target detection point and the second sub-model in every direction.
It should be noted that the spherically distributed distance functions of different target detection points are determined in the same way. In order to introduce the technical solution of this embodiment clearly, the determination of the spherically distributed distance function of one target detection point is taken as an example.
For example, one target detection point on the first sub-model can be taken as the center of a sphere, physical rays are emitted in every direction in space, and the distance information between the intersection of each physical ray with the second sub-model and the target detection point is determined. If a physical ray has an intersection with the second sub-model in its direction, the distance between the intersection and the target detection point is used as the distance information between the target detection point and the second sub-model in that direction. If a ray has no intersection with the second sub-model in its direction, preset maximum distance information can be used as the distance information between the target detection point and the second sub-model in that direction.
According to the determined distance information of the target detection point in every direction in space, the spherically distributed distance function of the target detection point can be determined. For example, the spherically distributed distance function of target detection point A is

F(i) = dist_i, i = 1, 2, ..., n

where i denotes the i-th direction, F(i) denotes the distance information in the i-th direction, dist_i denotes the specific distance value in that direction, and n denotes the total number of directions.
It should be noted that the spherically distributed distance function of each target detection point is a composite function, and the number of sub-functions in this composite function can be determined according to a preset number of samples; the default may be a precision of 16×32, that is, the composite function contains 512 sub-functions. The larger the number of samples, the more sub-functions the composite function contains and the higher the precision. The specific number of samples can be determined according to actual needs.
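The sampling described above can be illustrated by the sketch below, which casts rays from a target detection point over a 16×32 grid of directions (the default precision mentioned above) and records the hit distance, or a preset maximum value when a ray does not intersect the second sub-model. The cast_ray callable stands in for the engine's physics query and is an assumption rather than an API defined by the embodiments; the directions are spread approximately uniformly over the sphere (equal-area latitude bands) so that a simple normalization can later be used when projecting onto the basis functions.

    import math

    def sample_spherical_distances(detection_point, cast_ray,
                                   n_theta=16, n_phi=32, max_dist=100.0):
        """Return a list of (direction, distance) samples around the detection point."""
        samples = []
        for i in range(n_theta):
            # Equal-area latitude bands: cos(theta) is spaced uniformly in [-1, 1].
            cos_theta = 1.0 - 2.0 * (i + 0.5) / n_theta
            sin_theta = math.sqrt(max(0.0, 1.0 - cos_theta * cos_theta))
            for j in range(n_phi):
                phi = 2.0 * math.pi * j / n_phi
                direction = (sin_theta * math.cos(phi),
                             sin_theta * math.sin(phi),
                             cos_theta)
                hit_dist = cast_ray(detection_point, direction)  # None when there is no hit
                samples.append((direction, hit_dist if hit_dist is not None else max_dist))
        return samples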
It should also be noted that the projection coefficient values of different target detection points are determined in the same way. In order to introduce the technical solution of this embodiment clearly, the determination of the projection coefficient values of one target detection point is taken as an example.
The spherical harmonic function is composed of a plurality of basis functions, and each distance function can be processed using these basis functions. Different orders of the spherical harmonic function correspond to different numbers of basis functions: the second-order spherical harmonic function contains 4 basis functions, the third-order contains 9 basis functions, the fourth-order contains 16 basis functions, and so on. The projection coefficient values are the coefficient values obtained after the distance function is processed according to the basis functions of the spherical harmonic function, and their number is the same as the number of basis functions. The spherically distributed distance function of the target detection point can be compressed according to the spherical harmonic function and its corresponding basis functions to obtain the projection coefficient values.
Exemplarily, if the spherical harmonic function is of the second order, that is, it includes 4 basis functions, then inputting the distance function corresponding to the target detection point into this spherical harmonic function yields 4 projection coefficient values, thereby compressing the distance function into 4 projection coefficient values.
It should be noted that the higher the order of the spherical harmonic function, the higher the similarity between the reconstructed sphere and the actual sphere, so developers can select spherical harmonic functions of different orders according to actual needs. A higher-order spherical harmonic function contains more basis functions, and the degree of restoration is higher when the distance function is subsequently restored from the spherical harmonic function and the projection coefficient values.
For example, after the specific order of the spherical harmonic function has been determined, the spherically distributed distance function of the target detection point can be input into the spherical harmonic function, and the distance function is processed based on the basis functions of the spherical harmonic function to obtain the projection coefficient value of the distance function on each basis function. It follows that the number of projection coefficient values is equal to the number of basis functions of the spherical harmonic function.
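The projection step can be illustrated by the sketch below, which numerically integrates the sampled distance function against each basis function, yielding one projection coefficient per basis function. The basis argument may be, for example, the four-term basis shown in the earlier reconstruction sketch; the simple 4π/N normalization assumes that the samples are spread approximately uniformly over the sphere, as in the sampling sketch above. This is an illustration only, not a prescribed implementation.

    import math
    import numpy as np

    def project_onto_basis(samples, basis):
        """samples: list of (direction, distance) pairs covering the sphere.
        basis:   callable mapping a direction (3-vector) to an array of basis-function values."""
        coefficients = None
        for direction, dist in samples:
            values = dist * basis(np.asarray(direction, dtype=float))
            coefficients = values if coefficients is None else coefficients + values
        # Approximate the integral over the sphere: surface area 4*pi divided by sample count.
        return coefficients * (4.0 * math.pi / len(samples))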
It should be noted that the spherically distributed distance functions of different target detection points may be the same or different. When the distance information of different target detection points in every direction in space is identical, the distance functions corresponding to the different target detection points are the same; when the distance information of different target detection points differs in at least one direction in space, the distance functions corresponding to the different target detection points are different. That is, whether the distance functions are the same is determined by the distance information of the target detection points in each direction in space. After different distance functions are input into the preset spherical harmonic function for processing, the obtained projection coefficient values are different.
S320. Store the projection coefficient values of each target detection point into at least one image according to the spatial position relationship, and store the at least one image at a target position, so as to obtain, from the target position, the projection coefficient values of all target detection points on the spherical harmonic function of the corresponding order.
On the basis of the above technical solution, after the projection coefficient values corresponding to each target detection point are determined, the method further includes storing the projection coefficient values into at least one image, so that the projection coefficient values of the corresponding target detection point can be obtained from the at least one image and the corresponding target reconstruction function can be reconstructed.
For example, the projection coefficient values of each target detection point are correspondingly stored into at least one image according to the spatial position relationship, and the at least one image is stored at the target position, so that the projection coefficient values of all target detection points on the spherical harmonic function of the corresponding order can be obtained from the target position.
It should be noted that the projection coefficient values of different target detection points are stored in the same manner; the storage of the projection coefficient values corresponding to a certain target detection point is taken as an example for description.
If the spherical harmonic function is second-order, the number of basis functions may be four, and four projection coefficient values are obtained after the distance function is processed by the basis functions of the spherical harmonic function. The projection coefficient values may be stored into images according to the spatial coordinate position information. For example, if there are four projection coefficient values, four images may be used to store them, and each pixel in each image stores one projection coefficient value. The at least one image may be stored in the engine, and the storage location in the engine may be used as the target position. The advantage of storing the images at the target position is that the stored projection coefficient values can be obtained from the target position, so that a reconstruction function can be constructed from the projection coefficient values.
For example, the number of images may be determined according to the number of projection coefficient values, and the projection coefficient values may be stored into the corresponding pixels of the images according to the spatial coordinate information.
For example, if the order of the spherical harmonic function is 3, the number of basis functions in the spherical harmonic function is 9, and each target detection point has 9 projection coefficient values. In this case, nine images may be determined, and the nine projection coefficient values of each target detection point may be stored into the corresponding pixels of the nine images according to the spatial coordinate information.
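A minimal sketch of this storage scheme is given below, assuming one floating-point image per projection coefficient and that each target detection point already has an assigned pixel position; the image format and the mapping from spatial coordinates to pixels are not specified by the embodiment.

```python
import numpy as np

def store_coefficients_to_images(points, width, height, n_coeffs):
    """Store the projection coefficients of each target detection point into
    one image per coefficient; `points` is a list of (u, v, coeffs) tuples,
    where (u, v) is the pixel position derived from the point's spatial
    coordinate information."""
    images = [np.zeros((height, width), dtype=np.float32) for _ in range(n_coeffs)]
    for u, v, coeffs in points:
        for k in range(n_coeffs):
            images[k][v, u] = coeffs[k]
    return images
```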
It should be noted that the projection coefficient values may also be stored in vertex colors. Each target detection point on the first sub-model may be given a vertex color; each target detection point corresponds to pixel channels, namely the four RGBA channels, and one projection coefficient value of the target detection point may be stored in each channel. The attribute information may be extensible information corresponding to each target detection point, for example UV, that is, the u, v texture mapping coordinates. The vertex color corresponding to the target detection point and the attribute information of the current target detection point may be used together to store the projection coefficient values.
For example, when the projection coefficient values corresponding to the current target detection point are stored in the vertex color corresponding to the current target detection point, four projection coefficient values may be stored in the four channels R, G, B, and A, respectively, and the number of required vertex colors may be determined according to the number of projection coefficient values. It is also possible to store four projection coefficient values in one vertex color and store the remaining projection coefficient values in the UV coordinates corresponding to the current target detection point. This approach reduces the use of vertex colors and facilitates the storage and subsequent retrieval and use of the projection coefficient values.
For example, the vertex colors corresponding to the target detection point and the UV coordinates corresponding to the current target detection point may also be used together to store the projection coefficient values. For example, if 9 projection coefficient values are to be stored, two vertex colors may be used to store 8 of them, and the remaining projection coefficient value may be stored in the UV coordinates corresponding to the current target detection point.
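The packing described in the preceding paragraph can be sketched as follows; the layout (two RGBA vertex colors plus one spare UV channel for nine coefficients) follows that example, while the helper function itself is hypothetical, since a real engine would write these values into the mesh's vertex attributes.

```python
def pack_nine_coefficients(coeffs):
    """Pack nine projection coefficients into two RGBA vertex colors plus one
    spare UV channel (illustration only)."""
    assert len(coeffs) == 9
    vertex_color_0 = tuple(coeffs[0:4])    # R, G, B, A of the first vertex color
    vertex_color_1 = tuple(coeffs[4:8])    # R, G, B, A of the second vertex color
    extra_uv = (coeffs[8], 0.0)            # remaining value stored in a UV coordinate
    return vertex_color_0, vertex_color_1, extra_uv
```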
After the projection coefficient values are stored, they may be imported into the target position in the engine according to the index coordinates of the current target detection point, so that the projection coefficient values corresponding to the current target detection point can be retrieved from the engine according to the target detection point. The transparency parameter of each target detection point is then determined according to the target distance information reconstructed from the projection coefficient values, and the first sub-model and the second sub-model are displayed based on the transparency parameters.
S330. For each target detection point, retrieve the at least one image from the target position, determine the target pixel position of the current target detection point in the at least one image, and obtain the target reconstruction function corresponding to the current target detection point according to the retrieved projection coefficient values.
In practical applications, if each target detection point on the first sub-model needs to be displayed transparently, the distance information of each target detection point at the relative shooting angle needs to be determined, and the transparency parameter is then determined according to the distance information. When determining the distance information of each target detection point at the relative shooting angle, the reconstruction function corresponding to each target detection point may be reconstructed from the projection coefficient values stored at the target position, so that the relative shooting angle can be input into the corresponding reconstruction function to obtain the distance information corresponding to each target detection point.
For example, for each target detection point, the spatial coordinate information of the current target detection point may be obtained, the target pixel position of the current target detection point in each image may be determined according to the spatial coordinate information, and the stored projection coefficient values may be retrieved from the target pixel position in each image. The stored projection coefficient values are then reconstructed according to the spherical harmonic function, yielding, for the case where the current target detection point emits physical rays in every direction in space and a ray collides with the second sub-model, the distance information between the current target detection point and the point to be collided, and the distance function formed by this distance information. That is, the target reconstruction function corresponding to each target detection point is a distance function.
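Reconstruction itself is then a weighted sum of the basis functions. The sketch below reuses the hypothetical SH_BASIS list from the earlier projection example and evaluates the reconstructed distance function in a single direction.

```python
def reconstruct_distance(coeffs, direction):
    """Evaluate the target reconstruction function in one direction: the sum of
    each stored projection coefficient multiplied by its basis function."""
    x, y, z = direction
    return sum(c * basis(x, y, z) for c, basis in zip(coeffs, SH_BASIS))
```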
S340. Input the relative shooting angle information corresponding to the current target detection point into the target reconstruction function to obtain the distance information between the first sub-model and the second sub-model for the target detection point under the current shooting angle information.
S360. When it is detected that the first sub-model is to be displayed semi-transparently, determine, according to the distance information corresponding to each target detection point, the transparency information of the point to be collided corresponding to each target detection point.
S370. Based on the transparency information of each point to be collided, adjust the attribute information of each point to be collided, so that each point to be collided on the second sub-model is transparently displayed based on the attribute information.
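How a reconstructed distance is turned into a transparency value is left to a pre-established correspondence between distance information and transparency information, so the linear falloff below is only an assumed example in which a larger distance gives a weaker translucent effect.

```python
def distance_to_transparency(distance, max_distance):
    """Map a reconstructed distance to a transparency value in [0, 1], where
    1.0 means the strongest translucent effect (assumed linear falloff)."""
    t = min(max(distance / max_distance, 0.0), 1.0)
    return 1.0 - t
```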
In the technical solution of this embodiment of the present application, the projection coefficient values corresponding to each target detection point are determined in advance and stored into a preset number of maps, so that during transparent display the stored projection coefficient values can be retrieved from the corresponding positions, the reconstruction function can be rebuilt based on the projection coefficient values, and the distance information of each target detection point at different relative shooting angles can be determined based on the reconstruction function.
FIG. 4 is a schematic flowchart of a method for determining transparency according to another embodiment of the present application. On the basis of the foregoing embodiments, reference may be made to the technical solution of this embodiment for the specific manner of determining the distance function. Explanations of terms that are the same as or correspond to those in the above embodiments are not repeated here.
As shown in FIG. 4, the method includes:
S410. For each target detection point on the first sub-model, determine the information of each collision point to be processed when the current target detection point emits physical rays in all directions that pass through the second sub-model.
The information of each collision point to be processed may include the position information of each point to be collided, for example, spatial coordinate information. For example, with the current target detection point as the center of a sphere, physical rays are emitted in arbitrary directions in space, the collision points to be processed where each physical ray intersects the second sub-model are recorded, and the distance value between each collision point to be processed and the target detection point is determined.
For example, physical rays may be emitted in every direction from each target detection point on the first sub-model, and these physical rays may pass through the second sub-model. When a physical ray passes through the second sub-model, it is determined that a collision point to be processed exists between the physical ray and the second sub-model, and the intersection of the physical ray and the second sub-model is taken as the collision point to be processed. The information of the collision point to be processed may be information describing the position of the point, such as spatial coordinate information. Therefore, the spatial coordinate information corresponding to the collision point to be processed may be determined as the information of the collision point to be processed.
In this embodiment, determining the information of the collision points to be processed may be: with the current target detection point as the center of a sphere, emitting physical rays in arbitrary directions in space, and determining the information of the collision point to be processed when each physical ray passes through the second sub-model.
For example, emitting physical rays in every direction from each target detection point on the first sub-model may be regarded as taking the current target detection point as the center of a sphere and emitting physical rays in every direction of the spherical surface. If a physical ray passes through the second sub-model, the spatial coordinate information of the intersection of the physical ray and the second sub-model is taken as the information of the collision point to be processed.
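For illustration, the ray cast against the second sub-model can be sketched with an analytic intersection test. Approximating the second sub-model by a sphere is purely an assumption made to keep the example self-contained; a real implementation would raycast against the actual triangle geometry of the second sub-model.

```python
import math

def ray_sphere_intersection(origin, direction, center, radius):
    """Return the collision point of a ray (origin, unit direction) with a
    sphere, or None when the ray misses."""
    ox, oy, oz = origin
    dx, dy, dz = direction
    lx, ly, lz = center[0] - ox, center[1] - oy, center[2] - oz
    t_center = lx * dx + ly * dy + lz * dz                   # projection of L onto the ray
    d2 = lx * lx + ly * ly + lz * lz - t_center * t_center   # squared distance from center to ray
    if d2 > radius * radius:
        return None                                          # the ray misses the sphere
    t_half = math.sqrt(radius * radius - d2)
    t_hit = t_center - t_half
    if t_hit < 0.0:
        t_hit = t_center + t_half                            # the origin may lie inside the sphere
    if t_hit < 0.0:
        return None                                          # the sphere is behind the ray
    return (ox + t_hit * dx, oy + t_hit * dy, oz + t_hit * dz)
```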
S420. Determine the distance information between the current target detection point and the second sub-model in each direction according to the current target detection point and the information of each collision point to be processed.
When a collision point to be processed exists between a physical ray and the second sub-model, the distance information between the collision point to be processed and the current target detection point is determined.
For example, when a collision point to be processed exists, the distance information between the collision point to be processed and the current target detection point may be calculated from the spatial coordinate information of the current target detection point and the spatial coordinate information of the collision point to be processed, using the formula for the distance between two points in space.
When no collision point to be processed exists between a physical ray and the second sub-model, the distance information corresponding to that physical ray is set to a set value.
For example, if a physical ray does not pass through the second sub-model, for example, the emitted physical ray is parallel to the second sub-model or is emitted in a direction facing away from the second sub-model, no collision point exists between the physical ray and the second sub-model, and the collision information to be processed in this case may be set to the set value. The set value may be the maximum distance information between a collision point to be processed and the current target detection point.
According to the distance information or the set value corresponding to each collision point to be processed, the distance information between the current target detection point and the second sub-model in each direction is determined.
For example, based on the above two cases, the distance information or set value corresponding to each physical ray emitted from the current target detection point can be determined, and the distance information or set value is taken as the distance information between the current target detection point and the second sub-model in each direction.
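The per-direction distance then reduces to a Euclidean distance, with the set value substituted whenever a ray has no collision point. A small sketch follows; the particular set value chosen here is an assumption.

```python
import math

# Assumed "set value" used when a ray never passes through the second sub-model.
MISS_DISTANCE = 1.0e4

def collision_distance(detection_point, collision_point):
    """Distance between the target detection point and its collision point, or
    the set value when there is no collision point to be processed."""
    if collision_point is None:
        return MISS_DISTANCE
    return math.dist(detection_point, collision_point)
```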
S430. Determine the spherically distributed distance function of each target detection point according to the distance information of the target detection point in each direction.
For example, by taking the distance information of each target detection point in each direction in space as one sub-function of the spherically distributed distance function of the target detection point, the spherically distributed distance function of the target detection point can be obtained. The number of sub-functions in the distance function is the same as the amount of distance information of the target detection point in each direction of the spherical surface.
It should be noted that, in order to improve the precision, the number of sub-functions can be increased, that is, the density of the physical rays can be increased; the specific number of physical rays can be determined according to actual requirements.
S440. Determine the order of the spherical harmonic function and the distance function, and determine the number of basis functions and the representation of the basis functions.
Different orders of the spherical harmonic function contain different numbers of basis functions; for example, a second-order spherical harmonic function contains 4 basis functions, a third-order spherical harmonic function contains 9 basis functions, a fourth-order spherical harmonic function contains 16 basis functions, and so on. The higher the order of the spherical harmonic function, the better the effect of the subsequent reconstruction performed with the reconstruction function; the specific order needs to be set according to actual requirements.
For example, if the order of the spherical harmonic function is determined to be a according to requirements, the number of basis functions in the spherical harmonic function can be determined to be a^2. The representation of each basis function can be determined according to the relationship between the distance function and the projection coefficient values.
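As a small sketch of this relationship, the basis functions of an order-a spherical harmonic can be enumerated by their (l, m) indices, giving a^2 functions in total:

```python
def sh_index_pairs(order):
    """Enumerate the (l, m) indices of the basis functions of an order-`order`
    spherical harmonic; len(sh_index_pairs(order)) == order ** 2
    (4, 9, 16 for orders 2, 3, 4)."""
    return [(l, m) for l in range(order) for m in range(-l, l + 1)]
```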
S450. For each target detection point, process the distance function of the current target detection point based on each basis function of the spherical harmonic function to obtain the projection coefficient values of the current target detection point.
It should be noted that the projection coefficient values of different target detection points are determined in the same manner. To describe the technical solution of this embodiment clearly, the determination of the projection coefficient values of one target detection point is taken as an example.
The number of projection coefficient values is the same as the number of basis functions. A projection coefficient value is a value determined by applying each basis function of the preset spherical harmonic function to the distance function.
For example, processing the distance function of the target detection point based on each basis function yields the projection coefficient value corresponding to that basis function; therefore, the number of projection coefficient values is the same as the number of basis functions. By inputting the distance function of the target detection point into each basis function of the spherical harmonic function, the projection coefficient value of the distance function on each basis function can be obtained.
It should be noted that the spherically distributed distance functions of different target detection points are different, and after different distance functions are input into the preset spherical harmonic function for processing, the obtained projection coefficient values are different.
S460. Store the projection coefficient values of each target detection point into at least one image according to the spatial position relationship, and store the at least one image at the target position, so as to obtain, from the target position, the projection coefficient values of all target detection points on the spherical harmonic function of the corresponding order.
The target position may be a storage location in the engine. The engine may be the core component of an editable computer game system or of an interactive real-time graphics application. The target position may be a storage space in the engine for storing data and/or information; in this embodiment, it is the storage space for storing the coordinate information of the target detection points.
For example, after the number of projection coefficient values of the current target detection point is determined, a plurality of images corresponding to the number of coefficient values may be obtained. The storage position of the spatial coordinates in the images is determined according to the spatial coordinate information of the current target detection point, and the projection coefficient values of the current target detection point are stored at the corresponding storage positions in the images, so that the stored projection coefficient values can be obtained from the images at the target position and the reconstruction function can be rebuilt. That is, if the projection coefficient values corresponding to a certain target detection point need to be used, the projection coefficient values corresponding to the spatial coordinate information of the target detection point can be determined at the target position in the engine according to that spatial coordinate information, for subsequent use by the reconstruction function.
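Retrieval mirrors the storage step: the pixel position derived from the target detection point's spatial coordinates is looked up in every coefficient image. A minimal sketch, assuming the images produced by the earlier hypothetical store_coefficients_to_images helper:

```python
def load_coefficients_from_images(images, u, v):
    """Fetch the stored projection coefficients of one target detection point
    from its pixel position (u, v) in each coefficient image."""
    return [float(img[v, u]) for img in images]
```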
S470. Determine the relative shooting angle information corresponding to the shooting device and each target detection point on the first sub-model.
S480. Determine the target reconstruction function corresponding to the current target detection point, and input the relative shooting angle information corresponding to the current target detection point into the target reconstruction function to obtain the distance information between the first sub-model and the second sub-model for the target detection point under the current shooting angle information.
S490. When it is detected that the first sub-model is to be displayed semi-transparently, determine, according to the distance information corresponding to each target detection point, the transparency information of the point to be collided corresponding to each target detection point; and, based on the transparency information of each point to be collided, adjust the attribute information of each point to be collided, so that each point to be collided on the second sub-model is transparently displayed based on the attribute information.
On the basis of the above technical solution, it should be noted that, after the distance function corresponding to each target detection point is determined, the advantage of compressing the distance function based on the spherical harmonic function to obtain the projection coefficient values is as follows: each target detection point may correspond to one or more distance functions, in which case the amount of data to be stored is relatively large, and in order to reduce the occupancy of the storage space, the distance function can be compressed into a few projection coefficient values.
In the technical solution of this embodiment of the present application, the spherically distributed distance function of each target detection point on the first sub-model is determined, the distance function is processed based on the preset spherical harmonic function to determine the projection coefficients, and the projection coefficients are stored in the vertex colors and/or the attribute information, so that the transparency parameter at the target shooting angle is determined and displayed according to the stored information and the reconstruction function. This solves the technical problem that, when the first sub-model is transparently displayed with a fixed transparency display value, the transparent display deviates from the actual situation, resulting in a poor transparent display effect and a poor user experience. The transparency parameter is adjusted according to the actual situation, so that the transparent display effect is consistent with the actual theoretical effect, thereby improving the user experience.
FIG. 5 is a schematic structural diagram of an apparatus for determining transparency according to an embodiment of the present application. As shown in FIG. 5, the apparatus includes: a relative shooting angle information determination module 510, a distance information determination module 520, a transparency information determination module 530, and a transparent display module 540.
The relative shooting angle information determination module 510 is configured to determine the relative shooting angle information corresponding to the shooting device and each target detection point on the first sub-model. The distance information determination module 520 is configured to, for each target detection point, determine the target reconstruction function corresponding to the current target detection point, and input the relative shooting angle information corresponding to the current target detection point into the target reconstruction function to obtain the distance information between the first sub-model and the second sub-model for the target detection point under the current shooting angle information, wherein the first sub-model is a sub-model that wraps part of the second sub-model, and the reconstruction function is constructed based on the projection coefficient values of the target detection point on each basis function of the spherical harmonic function. The transparency information determination module 530 is configured to, when it is detected that the first sub-model is to be displayed semi-transparently, determine, according to the distance information corresponding to each target detection point, the transparency information of the point to be collided corresponding to each target detection point, wherein the point to be collided is the collision point on the second sub-model corresponding to the target detection point. The transparent display module 540 is configured to adjust the attribute information of each point to be collided based on the transparency information of each point to be collided, so that each point to be collided on the second sub-model is displayed based on the attribute information.
On the basis of the above technical solution, the apparatus further includes a target reconstruction function determination module, which is further configured to:
determine the target reconstruction function corresponding to each target detection point according to the pre-received coefficient values of each target detection point on each basis function of the spherical harmonic function, wherein the target reconstruction function is configured to process the input current shooting angle to obtain the distance information between the first sub-model to which the target detection point corresponding to the current shooting angle belongs and the second sub-model.
On the basis of the above technical solutions, the target reconstruction function determination module is further configured to, before the target reconstruction function corresponding to each target detection point is determined according to the pre-received coefficient values of each target detection point on each basis function of the spherical harmonic function: determine the projection coefficient value of each target detection point on each basis function of the spherical harmonic function. Determining the projection coefficient value of each target detection point on each basis function of the spherical harmonic function includes: for each target detection point on the first sub-model, determining the information of each collision point to be processed when the current target detection point emits physical rays in all directions that pass through the second sub-model, and determining the distance values of the current target detection point in each direction according to the current target detection point and the information of each collision point to be processed; determining the distance function corresponding to each target detection point according to the distance values of each target detection point in each direction; and processing the distance function of each target detection point based on the spherical harmonic function to obtain the projection coefficient values of each target detection point on each basis function of the spherical harmonic function, the spherical harmonic function including a plurality of basis functions.
On the basis of the above technical solutions, the distance information determination module, which is configured to determine the information of each collision point to be processed when the current target detection point emits physical rays in all directions that pass through the second sub-model and to determine the distance values of the current target detection point in each direction according to the current target detection point and the information of each collision point to be processed, is further configured to: with the current target detection point as the center of a sphere, emit physical rays in arbitrary directions in space, record the collision points to be processed where each physical ray intersects the second sub-model, and determine the distance value between each collision point to be processed and the target detection point; if no collision point to be processed exists between a physical ray and the second sub-model, mark the distance value corresponding to that physical ray as the set value; and determine the distance values of the current target detection point in each direction based on the distance values and set values corresponding to the physical rays.
On the basis of the above technical solutions, the apparatus further includes a projection coefficient value determination module, configured to: determine the order of the spherical harmonic function, and determine the representation and the number of the basis functions according to the order; and, for each target detection point, process the distance function of the current target detection point based on each basis function to obtain the projection coefficient values of the current target detection point in the directions of the basis functions, the number of projection coefficient values being the same as the number of basis functions.
On the basis of the above technical solutions, the apparatus further includes a projection coefficient value storage module, further configured to store the projection coefficient values of each target detection point into at least one image according to the spatial position relationship, and store the at least one image at the target position, so as to obtain, from the target position, the projection coefficient values of all target detection points on the spherical harmonic function of the corresponding order.
On the basis of the above technical solutions, the target reconstruction function determination module is further configured to: retrieve the at least one image from the target position, and determine the target pixel position of the current collision point in the at least one image; retrieve the stored projection coefficient values from the target pixel positions in the at least one image; and determine the target reconstruction function corresponding to the target detection point based on the projection coefficient values and the spherical harmonic function.
On the basis of the above technical solutions, the distance information determination module is further configured to: determine the relative shooting angle information corresponding to the current target detection point and the shooting device; and input the relative shooting angle information into the target reconstruction function to determine the distance information corresponding to the current collision point at the relative shooting angle, the distance information being the distance between the target detection point and the collision point on the second sub-model when the straight line through the target detection point and the shooting device collides with the second sub-model.
On the basis of the above technical solutions, the transparency display module is further configured to: when it is detected that the second sub-model is to be displayed semi-transparently, determine the transparency information of the point to be collided corresponding to each target detection point according to the distance information corresponding to each target detection point and the pre-established correspondence between distance information and transparency information.
On the basis of the above technical solutions, the transparency display module is further configured to adjust the color depth value of each point to be collided according to the transparency information of each point to be collided, so that transparent display is performed based on the color depth values of the points to be collided on the second sub-model.
In the technical solution of this embodiment, the spherically distributed distance function of each target detection point on the first sub-model is determined, the distance function is processed based on the preset spherical harmonic function to determine the projection coefficients, and the projection coefficients are stored in the vertex colors and/or the attribute information, so that the transparency parameter at the target shooting angle is determined and displayed according to the stored information and the reconstruction function. This solves the technical problem that, when the first sub-model is transparently displayed with a fixed transparency display value, the transparent display deviates from the actual situation, resulting in a poor transparent display effect and a poor user experience. The transparency parameter is adjusted according to the actual situation, so that the transparent display effect is consistent with the actual theoretical effect, thereby improving the user experience.
The apparatus for determining transparency provided by this embodiment of the present application can execute the method for determining transparency provided by any embodiment of the present application, and has functional modules and beneficial effects corresponding to the execution of the method.
FIG. 6 is a schematic structural diagram of an electronic device according to an embodiment of the present application; FIG. 6 shows a block diagram of an exemplary electronic device 60 suitable for implementing the embodiments of the present application. The electronic device 60 shown in FIG. 6 is merely an example and should not impose any limitation on the functions and scope of use of the embodiments of the present application.
As shown in FIG. 6, the electronic device 60 takes the form of a general-purpose computing device. The components of the electronic device 60 may include, but are not limited to: one or more processors or processing units 601, a system memory 602, and a bus 603 connecting different system components (including the system memory 602 and the processing unit 601).
The bus 603 represents one or more of several types of bus structures, including a memory bus or memory controller, a peripheral bus, an accelerated graphics port, a processor, or a local bus using any of a variety of bus structures. For example, these architectures include, but are not limited to, the Industry Standard Architecture (ISA) bus, the Micro Channel Architecture (MCA) bus, the Enhanced ISA bus, the Video Electronics Standards Association (VESA) local bus, and the Peripheral Component Interconnect (PCI) bus.
The electronic device 60 typically includes a variety of computer-system-readable media. These media may be any available media that can be accessed by the electronic device 60, including volatile and non-volatile media, and removable and non-removable media.
The system memory 602 may include computer-system-readable media in the form of volatile memory, such as a random access memory (RAM) 604 and/or a cache memory 605. The electronic device 60 may further include other removable/non-removable, volatile/non-volatile computer system storage media. By way of example only, the storage system 606 may be configured to read from and write to a non-removable, non-volatile magnetic medium (not shown in FIG. 6, commonly referred to as a "hard disk drive"). Although not shown in FIG. 6, a magnetic disk drive configured to read from and write to removable non-volatile magnetic disks (e.g., "floppy disks"), and an optical disk drive configured to read from and write to removable non-volatile optical disks (e.g., CD-ROM, DVD-ROM, or other optical media), may be provided. In these cases, each drive may be connected to the bus 603 through one or more data media interfaces. The system memory 602 may include at least one program product having a set (e.g., at least one) of program modules configured to perform the functions of each embodiment of the present application.
A program/utility 608 having a set (at least one) of program modules 607 may be stored, for example, in the system memory 602. Such program modules 607 include, but are not limited to, an operating system, one or more application programs, other program modules, and program data; each or some combination of these examples may include an implementation of a network environment. The program modules 607 generally perform the functions and/or methods of the embodiments described in the present application.
The electronic device 60 may also communicate with one or more external devices 609 (e.g., a keyboard, a pointing device, a display 610, etc.), with one or more devices that enable a user to interact with the electronic device 60, and/or with any device (e.g., a network card, a modem, etc.) that enables the electronic device 60 to communicate with one or more other computing devices. Such communication may take place through an input/output (I/O) interface 611. Moreover, the electronic device 60 may also communicate with one or more networks (e.g., a local area network (LAN), a wide area network (WAN), and/or a public network such as the Internet) through a network adapter 612. As shown in the figure, the network adapter 612 communicates with the other modules of the electronic device 60 through the bus 603. It should be understood that, although not shown in FIG. 6, other hardware and/or software modules may be used in conjunction with the electronic device 60, including but not limited to: microcode, device drivers, redundant processing units, external disk drive arrays, Redundant Arrays of Independent Disks (RAID) systems, tape drives, data backup storage systems, and the like.
The processing unit 601 executes various functional applications and data processing by running the programs stored in the system memory 602, for example, implementing the method for determining transparency provided by the embodiments of the present application.
An embodiment of the present application further provides a storage medium containing computer-executable instructions, the computer-executable instructions being configured to perform a method for determining transparency when executed by a computer processor.
The method includes:
determining relative shooting angle information corresponding to a shooting device and each target detection point on a first sub-model;
for each target detection point, determining a target reconstruction function corresponding to the current target detection point, and inputting the relative shooting angle information corresponding to the current target detection point into the target reconstruction function to obtain distance information between the first sub-model and a second sub-model for the target detection point under the current shooting angle information, wherein the first sub-model is a sub-model that wraps part of the second sub-model, and the reconstruction function is constructed based on projection coefficient values of the target detection point on each basis function of a spherical harmonic function;
when it is detected that the first sub-model is to be displayed semi-transparently, determining, according to the distance information corresponding to each target detection point, transparency information of a point to be collided corresponding to each target detection point, wherein the point to be collided is a collision point on the second sub-model corresponding to the target detection point; and
based on the transparency information of each point to be collided, adjusting attribute information of each point to be collided, so that each point to be collided on the second sub-model is transparently displayed based on the attribute information.
The computer storage medium of the embodiments of the present application may use any combination of one or more computer-readable media. The computer-readable medium may be a computer-readable signal medium or a computer-readable storage medium. The computer-readable storage medium may be, for example, but not limited to, an electrical, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the above. Examples (a non-exhaustive list) of computer-readable storage media include: an electrical connection having one or more wires, a portable computer disk, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disk read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the above. In this document, a computer-readable storage medium may be any tangible medium that contains or stores a program that can be used by or in combination with an instruction execution system, apparatus, or device.
A computer-readable signal medium may include a data signal propagated in baseband or as part of a carrier wave, which carries computer-readable program code. Such a propagated data signal may take a variety of forms, including but not limited to an electromagnetic signal, an optical signal, or any suitable combination of the above. A computer-readable signal medium may also be any computer-readable medium other than a computer-readable storage medium, and such a medium can send, propagate, or transmit a program configured to be used by or in combination with an instruction execution system, apparatus, or device.
The program code contained on a computer-readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wire, optical cable, radio frequency (RF), and the like, or any suitable combination of the above.
Computer program code configured to perform the operations of the embodiments of the present application may be written in one or more programming languages or a combination thereof, the programming languages including object-oriented programming languages such as Java, Smalltalk, and C++, as well as conventional procedural programming languages such as the "C" language or similar programming languages. The program code may be executed entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on a remote computer or server. In the case involving a remote computer, the remote computer may be connected to the user's computer through any kind of network, including a local area network (LAN) or a wide area network (WAN), or may be connected to an external computer (for example, through the Internet by using an Internet service provider).

Claims (13)

  1. A method for determining transparency, comprising:
    determining relative shooting angle information corresponding to a shooting device and each target detection point on a first sub-model;
    for each target detection point, determining a target reconstruction function corresponding to the current target detection point, and inputting the relative shooting angle information corresponding to the current target detection point into the target reconstruction function to obtain distance information between the first sub-model and a second sub-model for the target detection point under the current shooting angle information, wherein the first sub-model is a sub-model that wraps part of the second sub-model, and the reconstruction function is constructed based on projection coefficient values of the target detection point on each basis function of a spherical harmonic function;
    when it is detected that the first sub-model is to be displayed semi-transparently, determining, according to the distance information corresponding to each target detection point, transparency information of a point to be collided corresponding to each target detection point, wherein the point to be collided is a collision point on the second sub-model corresponding to the target detection point; and
    based on the transparency information of each point to be collided, adjusting attribute information of each point to be collided, so that each point to be collided on the second sub-model is transparently displayed based on the attribute information.
  2. The method according to claim 1, further comprising:
    determining the target reconstruction function corresponding to each target detection point;
    wherein determining the target reconstruction function corresponding to each target detection point comprises:
    determining the target reconstruction function corresponding to each target detection point according to pre-received projection coefficient values of each target detection point on each basis function of the spherical harmonic function, wherein the target reconstruction function is configured to process an input current shooting angle to obtain distance information between the first sub-model to which the target detection point corresponding to the current shooting angle belongs and the second sub-model.
  3. The method according to claim 1, before determining, according to the pre-received coefficient values of each target detection point on each basis function of the spherical harmonic function, the target reconstruction function corresponding to each target detection point, further comprising:
    determining the projection coefficient value of each target detection point on each basis function of the spherical harmonic function;
    wherein determining the projection coefficient value of each target detection point on each basis function of the spherical harmonic function comprises:
    for each target detection point on the first sub-model, determining to-be-processed collision point information obtained when the current target detection point emits physical rays in all directions that pass through the second sub-model, and determining, according to the current target detection point and each piece of to-be-processed collision point information, distance values of the current target detection point in all directions;
    determining, according to the distance values of each target detection point in all directions, a distance function corresponding to each target detection point; and
    processing the distance function of each target detection point based on the spherical harmonic function to obtain the projection coefficient value of each target detection point on each basis function of the spherical harmonic function, wherein the spherical harmonic function comprises a plurality of basis functions.
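An illustrative sketch of the precomputation step in claim 3 follows, under assumptions: directions are sampled uniformly on the sphere, the physics raycast is replaced by a stand-in stub against a made-up sphere, and only the first two SH bands (4 coefficients) are kept.

```python
# Monte Carlo projection of a per-point distance function onto a small SH basis.
import numpy as np

def real_sh_basis(d):
    x, y, z = d
    return np.array([0.282095, 0.488603 * y, 0.488603 * z, 0.488603 * x])

def uniform_sphere_directions(n, rng):
    """Uniformly sample n unit direction vectors."""
    v = rng.normal(size=(n, 3))
    return v / np.linalg.norm(v, axis=1, keepdims=True)

def raycast_distance(origin, direction):
    """Stand-in for an engine raycast against the second sub-model.
    Returns the hit distance, or None when the ray does not hit it."""
    center, radius = origin + np.array([0.0, 2.0, 0.0]), 1.0   # hypothetical target geometry
    oc = origin - center
    b = 2.0 * np.dot(direction, oc)
    c = np.dot(oc, oc) - radius * radius
    disc = b * b - 4.0 * c
    if disc < 0.0:
        return None
    t = (-b - np.sqrt(disc)) / 2.0
    return t if t > 0.0 else None

def project_point(origin, n_samples=2048, miss_value=0.0, seed=0):
    """Cast rays in all directions from one detection point and project the
    resulting distance function onto the SH basis (Monte Carlo estimate)."""
    rng = np.random.default_rng(seed)
    coeffs = np.zeros(4)
    for d in uniform_sphere_directions(n_samples, rng):
        hit = raycast_distance(origin, d)
        dist = miss_value if hit is None else hit      # set value for rays with no collision
        coeffs += dist * real_sh_basis(d)
    return coeffs * (4.0 * np.pi / n_samples)          # uniform-sampling Monte Carlo weight

print(project_point(np.zeros(3)))
```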
  4. The method according to claim 3, wherein determining the to-be-processed collision point information obtained when the current target detection point emits physical rays in all directions that pass through the second sub-model, and determining, according to the current target detection point and each piece of to-be-processed collision point information, the distance values of the current target detection point in all directions comprises:
    taking the current target detection point as a sphere center, emitting physical rays in arbitrary directions in space, recording the to-be-processed collision point at which each physical ray intersects the second sub-model, and determining a distance value between each to-be-processed collision point and the target detection point;
    when a physical ray has no to-be-processed collision point with the second sub-model, marking the distance value corresponding to the physical ray as a set value; and
    determining the distance values of the current target detection point in all directions based on the distance values and set values corresponding to the physical rays.
  5. The method according to claim 3, wherein processing the distance function of each target detection point based on the spherical harmonic function to obtain the projection coefficient value of each target detection point on each basis function of the spherical harmonic function comprises:
    determining an order of the spherical harmonic function, and determining a representation of the basis functions and a number of the basis functions according to the order; and
    for each target detection point, processing the distance function of the current target detection point based on each basis function to obtain projection coefficient values of the current target detection point in the directions of the basis functions, wherein the number of the projection coefficient values is the same as the number of the basis functions.
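The order/basis-count relationship mentioned in claim 5 can be checked with a small illustration, assuming the common convention that an SH expansion of order n keeps bands l = 0..n-1, each band contributing 2l+1 basis functions, so n² coefficients in total.

```python
# Number of SH basis functions (and therefore projection coefficients) per order.
def sh_basis_count(order):
    return sum(2 * l + 1 for l in range(order))   # equals order * order

for order in (1, 2, 3, 4):
    print(order, sh_basis_count(order))           # 1, 4, 9, 16 coefficients
```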
  6. The method according to claim 3, further comprising:
    storing the projection coefficient values of each target detection point into at least one image according to a spatial position relationship, and storing the at least one image at a target location, so that the projection coefficient values of all target detection points on the spherical harmonic function of the corresponding order can be obtained from the target location.
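One way to picture the storage step in claim 6 is to pack each detection point's coefficients into the channels of a floating-point image. The sketch below is an assumption-laden illustration: each point owns one pixel, and its four band-0/1 coefficients go into the R, G, B, A channels of a single image saved to a hypothetical file.

```python
# Pack per-point SH coefficients into an RGBA float image by pixel position.
import numpy as np

def pack_coefficients(points_coeffs, width, height):
    """points_coeffs: dict mapping (px, py) pixel position -> 4 SH coefficients."""
    image = np.zeros((height, width, 4), dtype=np.float32)
    for (px, py), coeffs in points_coeffs.items():
        image[py, px] = coeffs
    return image

coeff_image = pack_coefficients({(0, 0): np.array([0.35, 0.02, 0.10, -0.04]),
                                 (1, 0): np.array([0.41, 0.00, 0.08, 0.03])},
                                width=256, height=256)
np.save("sh_coefficients.npy", coeff_image)   # stand-in for "storing at the target location"
```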
  7. The method according to claim 6, wherein determining the target reconstruction function corresponding to the current target detection point comprises:
    for each target detection point, retrieving the at least one image from the target location, and determining a target pixel position of the current target detection point in the at least one image;
    retrieving the stored projection coefficient values from the target pixel position in each of the at least one image; and
    performing reconstruction processing on the projection coefficient values corresponding to the current target detection point based on the spherical harmonic function to obtain the target reconstruction function corresponding to the current target detection point.
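A companion sketch for claim 7: fetch one detection point's stored coefficients at its pixel position and wrap them as a callable reconstruction function. The coefficient image, pixel position, and four-coefficient layout are assumptions carried over from the earlier illustrations.

```python
# Read back per-point SH coefficients and rebuild the reconstruction function.
import numpy as np

def real_sh_basis(d):
    x, y, z = d
    return np.array([0.282095, 0.488603 * y, 0.488603 * z, 0.488603 * x])

def make_reconstruction(coeffs):
    """Wrap one point's stored coefficients as a callable reconstruction function."""
    def reconstruct(view_dir):
        d = np.asarray(view_dir, dtype=float)
        d = d / np.linalg.norm(d)
        return float(coeffs @ real_sh_basis(d))
    return reconstruct

# Hypothetical coefficient image with one detection point stored at pixel (0, 0).
coeff_image = np.zeros((256, 256, 4), dtype=np.float32)
coeff_image[0, 0] = [0.35, 0.02, 0.10, -0.04]

px, py = 0, 0                                   # target pixel position of that point
f = make_reconstruction(coeff_image[py, px])
print(f([0.0, 0.3, 1.0]))                       # distance estimate for this view direction
```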
  8. The method according to claim 1, wherein inputting the relative shooting angle information corresponding to the current target detection point into the target reconstruction function to obtain the distance information between the first sub-model and the second sub-model at the target detection point under the current shooting angle information comprises:
    determining the relative shooting angle information between the current target detection point and the shooting apparatus; and
    inputting the relative shooting angle information into the target reconstruction function, and determining the distance information corresponding to the current collision point under the relative shooting angle, wherein the distance information is the distance between the target detection point and the collision point on the second sub-model when the straight line on which the target detection point and the shooting apparatus lie collides with the second sub-model.
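A tiny sketch of the lookup in claim 8, under the assumption that the "relative shooting angle" can be represented as the unit direction from the detection point toward the camera, which is then fed to the per-point reconstruction function.

```python
# Relative view direction between a detection point and the shooting apparatus.
import numpy as np

def relative_view_direction(camera_pos, detection_point):
    v = np.asarray(camera_pos, dtype=float) - np.asarray(detection_point, dtype=float)
    return v / np.linalg.norm(v)

print(relative_view_direction([0.0, 1.8, 3.0], [0.0, 1.5, 0.2]))
```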
  9. The method according to claim 1, wherein when it is detected that the first sub-model is displayed translucently, determining, according to the distance information corresponding to each target detection point, the transparency information of the to-be-collided point corresponding to each target detection point comprises:
    when it is detected that the second sub-model is displayed translucently, determining, according to the distance information corresponding to each target detection point and a pre-established correspondence between distance information and transparency information, the transparency information of the to-be-collided point corresponding to each target detection point.
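One possible form of the pre-established distance-to-transparency correspondence in claim 9 is shown below; the exponential falloff is purely an illustrative choice, not the claimed mapping, reflecting the idea that a larger gap between the two sub-models yields a weaker see-through effect.

```python
# Example distance-to-transparency correspondence (illustrative only).
import numpy as np

def transparency_from_distance(distance, falloff=2.0):
    return float(np.exp(-falloff * max(distance, 0.0)))

for d in (0.0, 0.25, 0.5, 1.0):
    print(d, round(transparency_from_distance(d), 3))
```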
  10. The method according to claim 1, wherein adjusting the attribute information of each to-be-collided point based on the transparency information of each to-be-collided point, so that each to-be-collided point on the second sub-model is displayed based on the attribute information, comprises:
    adjusting a color depth value of each to-be-collided point according to the transparency information of each to-be-collided point, so as to perform transparent display based on the color depth value of each to-be-collided point on the second sub-model.
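The attribute adjustment in claim 10 can be sketched as writing the transparency value into a point's color, assuming RGBA colors in [0, 1] with the alpha channel standing in for the color depth value; the renderer's ordinary blending then produces the see-through result.

```python
# Modulate a point's alpha channel by its computed transparency (illustrative).
import numpy as np

def apply_transparency(rgba, transparency):
    out = np.array(rgba, dtype=float)
    out[3] = out[3] * transparency      # adjust the existing alpha
    return out

print(apply_transparency([0.8, 0.6, 0.5, 1.0], 0.4))
```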
  11. An apparatus for determining transparency, comprising:
    a relative shooting angle information determination module, configured to determine relative shooting angle information between a shooting apparatus and each target detection point on a first sub-model;
    a distance information determination module, configured to, for each target detection point, determine a target reconstruction function corresponding to the current target detection point, and input the relative shooting angle information corresponding to the current target detection point into the target reconstruction function to obtain distance information between the first sub-model and a second sub-model at the target detection point under the current shooting angle information, wherein the first sub-model is a sub-model that wraps part of the second sub-model, and the reconstruction function is constructed based on projection coefficient values of the target detection point on each basis function of a spherical harmonic function;
    a transparency information determination module, configured to, when it is detected that the first sub-model is displayed translucently, determine, according to the distance information corresponding to each target detection point, transparency information of a to-be-collided point corresponding to each target detection point, wherein the to-be-collided point is a collision point on the second sub-model corresponding to the target detection point; and
    a transparent display module, configured to adjust attribute information of each to-be-collided point based on the transparency information of each to-be-collided point, so that each to-be-collided point on the second sub-model is displayed based on the attribute information.
  12. An electronic device, comprising:
    one or more processors; and
    a storage apparatus configured to store one or more programs,
    wherein the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the method for determining transparency according to any one of claims 1-10.
  13. A storage medium containing computer-executable instructions which, when executed by a computer processor, are configured to perform the method for determining transparency according to any one of claims 1-10.
PCT/CN2021/131500 2020-12-08 2021-11-18 Transparency determination method and apparatus, and electronic device and storage medium WO2022121654A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202011445988.4A CN114627040A (en) 2020-12-08 2020-12-08 Method and device for determining transparency, electronic equipment and storage medium
CN202011445988.4 2020-12-08

Publications (1)

Publication Number Publication Date
WO2022121654A1 true WO2022121654A1 (en) 2022-06-16

Family

ID=81895496

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/131500 WO2022121654A1 (en) 2020-12-08 2021-11-18 Transparency determination method and apparatus, and electronic device and storage medium

Country Status (2)

Country Link
CN (1) CN114627040A (en)
WO (1) WO2022121654A1 (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6361438B1 (en) * 1997-07-25 2002-03-26 Konami Co., Ltd. Video game transparency control system for images
US20070018996A1 (en) * 2005-07-25 2007-01-25 Microsoft Corporation Real-time rendering of partially translucent objects
CN106023300A (en) * 2016-05-09 2016-10-12 深圳市瑞恩宁电子技术有限公司 Body rendering method and system of semitransparent material
CN111243075A (en) * 2020-03-17 2020-06-05 广东趣炫网络股份有限公司 Method, device and equipment for generating water depth map for hand tour

Also Published As

Publication number Publication date
CN114627040A (en) 2022-06-14

Similar Documents

Publication Publication Date Title
CN111815755A (en) Method and device for determining shielded area of virtual object and terminal equipment
US11263803B2 (en) Virtual reality scene rendering method, apparatus and device
WO2021228031A1 (en) Rendering method, apparatus and system
WO2020248900A1 (en) Panoramic video processing method and apparatus, and storage medium
CN111882634B (en) Image rendering method, device, equipment and storage medium
US11727632B2 (en) Shader binding management in ray tracing
CN112840378A (en) Global lighting interacting with shared illumination contributions in path tracing
WO2022121653A1 (en) Transparency determination method and apparatus, electronic device, and storage medium
WO2022089592A1 (en) Graphics rendering method and related device thereof
US11954830B2 (en) High dynamic range support for legacy applications
US11475549B1 (en) High dynamic range image generation from tone mapped standard dynamic range images
EP3956752B1 (en) Semantic-augmented artificial-reality experience
WO2022121654A1 (en) Transparency determination method and apparatus, and electronic device and storage medium
CN112528707A (en) Image processing method, device, equipment and storage medium
WO2022121652A1 (en) Transparency determination method and apparatus, electronic device, and storage medium
US11836844B2 (en) Motion vector optimization for multiple refractive and reflective interfaces
Fu et al. Dynamic shadow rendering with shadow volume optimization
CN115715464A (en) Method and apparatus for occlusion handling techniques
CN114612603A (en) Method and device for determining transparency, electronic equipment and storage medium
CN112465692A (en) Image processing method, device, equipment and storage medium
CN114627231A (en) Method and device for determining transparency, electronic equipment and storage medium
US11887245B2 (en) Techniques for rendering signed distance functions
US11823318B2 (en) Techniques for interleaving textures
US20230267063A1 (en) Real-time latency measurements in streaming systems and applications
CN115457180A (en) Three-dimensional terrain gradient rendering method and device, electronic equipment and storage medium

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21902361

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the addressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 14.11.2023)