CN108921778B - Method for generating star effect map - Google Patents

Method for generating star effect map

Info

Publication number
CN108921778B
CN108921778B (application number CN201810739039.3A)
Authority
CN
China
Prior art keywords
dimensional
vertex
coordinates
coordinate
transformation
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201810739039.3A
Other languages
Chinese (zh)
Other versions
CN108921778A (en)
Inventor
黄超
徐滢
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Chengdu Pinguo Technology Co Ltd
Original Assignee
Chengdu Pinguo Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Chengdu Pinguo Technology Co Ltd filed Critical Chengdu Pinguo Technology Co Ltd
Priority to CN201810739039.3A priority Critical patent/CN108921778B/en
Publication of CN108921778A publication Critical patent/CN108921778A/en
Application granted granted Critical
Publication of CN108921778B publication Critical patent/CN108921778B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T3/00 Geometric image transformations in the plane of the image
    • G06T3/08 Projecting images onto non-planar surfaces, e.g. geodetic screens
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/00 3D [Three Dimensional] image rendering
    • G06T15/005 General purpose rendering architectures

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Graphics (AREA)
  • Image Generation (AREA)

Abstract

The invention discloses a method for generating a planet effect map, which belongs to the technical field of image processing and comprises the following steps: creating an OPENGL ES rendering environment; inputting an original two-dimensional image to be processed; and drawing a planet effect map on the original two-dimensional image by using a pre-written shader and calling the API (Application Programming Interface) of the OPENGL ES rendering environment. The shader is used to calculate transformed texture coordinates of the original two-dimensional image, and the transformed texture coordinates are used to construct the planet effect. The method requires only simple user operation and can generate a good planet effect map with one tap, thereby greatly improving the user experience.

Description

Method for generating star effect map
Technical Field
The invention relates to the technical field of image processing, in particular to a method for generating a star effect image.
Background
The planet effect is similar to 360-degree panoramic photography: the picture is composed as a circle, with the two ends of the image joined so that the scene wraps around on itself.
In the prior art, PS (Photoshop) is generally used to convert an ordinary two-dimensional image into a planet effect map. The specific operations are: 1. open the image to be converted in PS and adjust it into a square; 2. open the Distort function under the Filter menu in PS, distort the square image with the Polar Coordinates filter, and adjust the parameters to obtain a planet effect map; 3. retouch the joined edges of the planet effect map so that the seam transitions smoothly and naturally.
As can be seen from the above steps, generating a planet effect map with the prior art requires the user to have basic PS experience. Moreover, the step of adjusting the image into a square in PS may distort the image, for example by compressing or stretching it. In addition, the prior art offers few ways to adjust the image, so the effect is monotonous, and smoothing the image seam is complex for an ordinary user. These defects all make it inconvenient for a user to obtain a good planet effect map.
Disclosure of Invention
The invention aims to provide a method for generating a planet effect map that is simple for the user to operate and can quickly generate a planet effect map with a good effect, thereby greatly improving the user experience.
In order to achieve this purpose, the technical solution adopted by the invention is as follows:
a method for generating a star effect map comprises the following steps: creating an OPENGL ES rendering environment; inputting an original two-dimensional image to be processed; drawing a planet effect graph on the original two-dimensional image by adopting a pre-programmed shader and calling an API (application program interface) of the OPENGL ES rendering environment; the shader is used for calculating the transformation texture coordinates of the original two-dimensional image, and the transformation texture coordinates are used for constructing the planet effect.
Preferably, the method for the shader to calculate the transformed texture coordinates of the original two-dimensional image comprises: acquiring texture coordinates of the original two-dimensional image; converting the texture coordinates into 3D vertex coordinates in a preset virtual sphere; and mapping the 3D vertex coordinates back to a two-dimensional plane to obtain the transformation texture coordinates.
Preferably, the method for converting the texture coordinates into the 3D vertex coordinates in the preset virtual sphere is as follows:
Figure BDA0001722764660000021
wherein (x, y) are the texture coordinates and (x, y, z) are the 3D vertex coordinates;
the method for mapping the 3D vertex coordinates back to a two-dimensional plane and obtaining the transformation texture coordinates comprises the following steps: converting the 3D vertex coordinates into a coordinate representation mode in a spherical coordinate system, and acquiring longitude and latitude of the 3D vertex coordinates in the spherical coordinate system; and normalizing the longitude and the latitude of the 3D vertex coordinate in a spherical coordinate system to obtain the transformation texture coordinate.
Preferably, the method for obtaining the transformed texture coordinate by normalizing the longitude and latitude of the 3D vertex coordinate in the spherical coordinate system includes:
u=φ/2π,v=θ/π
wherein φ is the longitude of the 3D vertex coordinate in the spherical coordinate system, θ is the latitude of the 3D vertex coordinate in the spherical coordinate system, and (u, v) are transformation texture coordinates.
Further, after the converting the texture coordinates into the 3D vertex coordinates in the preset virtual sphere, the method further includes: performing three-dimensional rotation transformation on the 3D vertex coordinate by adopting a preset three-dimensional transformation matrix to obtain a second 3D vertex coordinate; the three-dimensional transformation matrix includes: a three-dimensional transformation matrix in the X direction, a three-dimensional transformation matrix in the Y direction and a three-dimensional transformation matrix in the Z direction; and mapping the second 3D vertex coordinate back to a two-dimensional plane to obtain the transformation texture coordinate.
Further, after the performing three-dimensional rotation transformation on the 3D vertex coordinates by using a preset three-dimensional transformation matrix to obtain second 3D vertex coordinates, the method further includes: zooming the second 3D vertex coordinate to obtain a third 3D vertex coordinate; and mapping the third 3D vertex coordinate back to a two-dimensional plane to obtain the transformation texture coordinate.
Preferably, the method for performing three-dimensional rotation transformation on the 3D vertex coordinates by using a preset three-dimensional transformation matrix to obtain second 3D vertex coordinates includes: performing rotation transformation on the X direction of the 3D vertex coordinate by adopting the three-dimensional transformation matrix in the X direction, performing rotation transformation on the Y direction of the 3D vertex coordinate by adopting the three-dimensional transformation matrix in the Y direction, and performing rotation transformation on the Z direction of the 3D vertex coordinate by adopting the three-dimensional transformation matrix in the Z direction to obtain a second 3D vertex coordinate;
or multiplying the three-dimensional transformation matrix in the X direction, the three-dimensional transformation matrix in the Y direction and the three-dimensional transformation matrix in the Z direction to obtain a second three-dimensional transformation matrix; and multiplying the 3D vertex coordinates by the second three-dimensional transformation matrix to obtain second 3D vertex coordinates.
Further, the shader is also configured to stretch or compress the transformed texture coordinates.
Further, the shader is also used for smoothing the image seam of the star effect image to obtain a smooth star effect image.
Preferably, smoothing is performed on the image seams of the star effect image by adopting an Alpha transparent mixing technology to obtain a smooth star effect image.
According to the method for generating a planet effect map provided by the invention, a pre-written shader is used to obtain the transformed texture coordinates of the original two-dimensional image, the planet effect map is constructed from these transformed texture coordinates in the prepared rendering environment, and the planet effect map of the original two-dimensional image is thus generated with one tap, directly by program. Compared with the prior-art approach of generating the planet effect map with PS, the user obtains the planet effect directly through the shader without performing complex operations, which greatly improves the user experience. Furthermore, in the shader algorithm, the transformed texture coordinates of the original two-dimensional image are obtained by converting its 2D texture coordinates into 3D vertex coordinates on a virtual sphere and then mapping these 3D vertex coordinates back to the two-dimensional plane. This is a virtual three-dimensional reconstruction technique: it achieves the same effect as rendering a three-dimensional object with a real depth buffer, while the calculation and processing are simpler and more efficient. Moreover, the whole process of obtaining the planet effect map is completed in a single shader, which can easily be extended and reused wherever this function is needed, greatly improving development efficiency.
Drawings
FIG. 1 is a flow chart of a method of an embodiment of the present invention;
FIG. 2 is a schematic diagram of a spherical coordinate system according to an embodiment of the invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention is further described in detail below with reference to the accompanying drawings.
FIG. 1 is a flowchart of a method according to an embodiment of the present invention, including:
step 101, creating an OPENGL ES rendering environment;
in this embodiment, in order to achieve a real-time rendering effect on a mobile phone or other mobile terminals, a user obtains an optimal experience, and the GPU is directly used for data processing. Of course, the CPU may be used to process data where real-time rendering is not required. The processing by the GPU is based on using OPENGL ES as a rendering API (Application Programming Interface). The method specifically comprises the following two creating processes:
an OPENGL ES environment is created. The environment is a rendering environment, different platforms have different implementation processes, but the final upper layer interfaces all need a rendering environment to render images, and the environment is called OPENGL ES context environment. Whether the API call of OPENGL ES is successful depends on the environment, and if the environment is incorrect, the rendering process cannot be completed smoothly.
Creating a frame buffer. This is the buffer required for image rendering (i.e., rendering the planet effect map). It may carry a depth buffer, a stencil buffer, a color buffer, and so on. The present invention mainly performs image processing, so a color buffer is attached by rendering directly to a texture.
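As an illustration, the render-to-texture setup described above can be sketched with standard OpenGL ES 2.0 calls. The listing below is a minimal sketch rather than the patented code; the function name, the width/height parameters and the minimal error handling are assumptions.

#include <GLES2/gl2.h>

/* Minimal render-to-texture setup (illustrative sketch): a frame buffer whose
 * color attachment is a texture, so the planet effect map is rendered
 * straight into a texture instead of the screen. */
GLuint create_output_target(GLsizei width, GLsizei height, GLuint *out_tex)
{
    GLuint fbo, tex;

    /* Output texture created from empty (NULL) data. */
    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_2D, tex);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, width, height, 0,
                 GL_RGBA, GL_UNSIGNED_BYTE, NULL);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);

    /* Frame buffer with the texture mounted as its color buffer. */
    glGenFramebuffers(1, &fbo);
    glBindFramebuffer(GL_FRAMEBUFFER, fbo);
    glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                           GL_TEXTURE_2D, tex, 0);

    if (glCheckFramebufferStatus(GL_FRAMEBUFFER) != GL_FRAMEBUFFER_COMPLETE)
        return 0;                      /* setup failed */

    *out_tex = tex;
    return fbo;
}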
Step 102, inputting an original two-dimensional image to be processed;
Step 103, drawing a planet effect map on the original two-dimensional image by using the pre-written shader and calling the API of the OPENGL ES rendering environment; the shader is used to calculate the transformed texture coordinates of the original two-dimensional image, and the transformed texture coordinates are used to construct the planet effect.
In this embodiment, the general process is as follows: first, a texture is generated from the original two-dimensional image and input to the GPU as the image to be processed, and another texture is created from empty data to serve as the output image. Then the vertex coordinates and texture coordinates of the original two-dimensional image are prepared, the pre-written shader program is compiled, the shader parameters are set, and everything is submitted to the GPU for computation; the original two-dimensional image is rendered, and the planet effect map is finally obtained.
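The first part of this process, uploading the original two-dimensional image as the input texture, might look like the following sketch; the function name and the assumption that the pixel pointer and dimensions come from the application's image loader are not taken from the patent.

#include <GLES2/gl2.h>

/* Illustrative sketch: upload the original two-dimensional image as the input
 * texture that the fragment shader will sample from. `pixels`, `width` and
 * `height` are assumed to be supplied by the caller. */
GLuint upload_source_image(const void *pixels, GLsizei width, GLsizei height)
{
    GLuint tex;
    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_2D, tex);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, width, height, 0,
                 GL_RGBA, GL_UNSIGNED_BYTE, pixels);
    /* Clamp-to-edge is safe for non-power-of-two textures in OpenGL ES 2.0. */
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    return tex;
}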
The most important step in the present invention is writing the shader, which is the core of the algorithm in this embodiment. Specifically, the method by which the shader obtains the transformation parameters of the original two-dimensional image is as follows:
acquiring texture coordinates of the original two-dimensional image;
after the previous rendering process is prepared, how to render the image, i.e. what effect is desired, is the key to the writing of the fragment shader in the program. The vertex shader has no excessive special operation, the vertex coordinates and the texture coordinates are mainly subjected to interpolation operation, and the interpolated vertex coordinates and texture coordinates are used in the subsequent fragment shader. The "vertex coordinates" are the vertex coordinates required for drawing an image, and define what shape this primitive is, and the coordinate values given in this embodiment define a quadrilateral shape. The "texture coordinates" are a set of floating point numbers 0 to 1 indicating where the image is sampled from the texture to determine the image display in this quadrilateral-shaped region, and their values are input by their own specification.
The vertex coordinate data prepared here (fed to the pipeline as sketched in the code after these coordinates) are: [-1.0, 1.0, -1.0, -1.0, 1.0, 1.0, 1.0, -1.0]
V0: -1.0, 1.0
V1: -1.0, -1.0
V2: 1.0, 1.0
V3: 1.0, -1.0
The texture coordinates are: [0.0, 1.0, 0.0, 0.0, 1.0, 1.0, 1.0, 0.0]
UV0: 0.0, 1.0
UV1: 0.0, 0.0
UV2: 1.0, 1.0
UV3: 1.0, 0.0
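A minimal sketch of how these two arrays might be handed to OpenGL ES and drawn as a quadrilateral is shown below; the attribute names aPosition and aTexCoord and the program handle are hypothetical, not names taken from the patent.

#include <GLES2/gl2.h>

/* Vertex and texture coordinates for the quadrilateral, exactly as listed
 * above (V0..V3 and UV0..UV3), drawn as a triangle strip. */
static const GLfloat kVertices[8]  = { -1.0f,  1.0f,  -1.0f, -1.0f,
                                        1.0f,  1.0f,   1.0f, -1.0f };
static const GLfloat kTexCoords[8] = {  0.0f,  1.0f,   0.0f,  0.0f,
                                        1.0f,  1.0f,   1.0f,  0.0f };

/* Illustrative draw call; `program` is an already linked shader program and
 * the attribute names are assumptions. */
void draw_quad(GLuint program)
{
    GLint pos = glGetAttribLocation(program, "aPosition");
    GLint tex = glGetAttribLocation(program, "aTexCoord");

    glUseProgram(program);
    glEnableVertexAttribArray((GLuint)pos);
    glVertexAttribPointer((GLuint)pos, 2, GL_FLOAT, GL_FALSE, 0, kVertices);
    glEnableVertexAttribArray((GLuint)tex);
    glVertexAttribPointer((GLuint)tex, 2, GL_FLOAT, GL_FALSE, 0, kTexCoords);

    glDrawArrays(GL_TRIANGLE_STRIP, 0, 4);  /* triangles V0,V1,V2 and V1,V2,V3 */
}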
Transforming a two-dimensional image can be regarded as the process of recalculating its texture coordinates: the final image is the result of the fragment shader processing every pixel, i.e. different texture coordinates are used to sample pixels at different positions in the original two-dimensional image, and these samples form the final effect image.
In this embodiment, the input texture coordinates range from 0 to 1. For convenience of calculation, the default texture coordinate center point (0.5, 0.5) is moved to (0.0, 0.0), i.e. 0.5 is subtracted from both u and v, so the range becomes -0.5 to 0.5. In the subsequent steps a virtual sphere of diameter 1 is constructed, so the shifted texture coordinate range exactly matches the coordinate range of the virtual sphere.
Converting the texture coordinates into 3D vertex coordinates in a virtual sphere;
a "virtual sphere" is distinguished from a true three-dimensional space in that a virtual sphere is modeled in a two-dimensional space and not a true three-dimensional space. The true three-dimensional space also requires the aforementioned need for a depth buffer added to the frame buffer to represent the Z-axis, i.e., depth information. Since we do not render images in true three-dimensional space, to convert texture coordinates to 3D vertex coordinates in a virtual sphere, we use the following formula for the conversion:
Figure BDA0001722764660000081
wherein (x, y) are the texture coordinates and (x, y, z) are the 3D vertex coordinates.
This is a mapping relation: the texture coordinates of the original two-dimensional image are converted into the three components x, y and z, so that mapping coordinates of a three-dimensional sphere are obtained on the two-dimensional plane.
In order to give the final image variety, after the texture coordinates are converted into 3D vertex coordinates on the virtual sphere, the method further includes: performing a three-dimensional rotation transformation on the 3D vertex coordinates with preset three-dimensional transformation matrices to obtain second 3D vertex coordinates. The three-dimensional transformation matrices include:
Three-dimensional transformation matrix in the X direction: [1.0, 0.0, 0.0, 0.0, cos(x), sin(x), 0.0, -sin(x), cos(x)], used for the rotation about the X axis;
Three-dimensional transformation matrix in the Y direction: [cos(y), 0.0, -sin(y), 0.0, 1.0, 0.0, sin(y), 0.0, cos(y)], used for the rotation about the Y axis;
Three-dimensional transformation matrix in the Z direction: [cos(z), sin(z), 0.0, -sin(z), cos(z), 0.0, 0.0, 0.0, 1.0], used for the rotation about the Z axis.
specifically, the three-dimensional transformation matrix in the X direction is adopted to perform rotational transformation on the X direction of the 3D vertex coordinates, the three-dimensional transformation matrix in the Y direction is adopted to perform rotational transformation on the Y direction of the 3D vertex coordinates, and the three-dimensional transformation matrix in the Z direction is adopted to perform rotational transformation on the Z direction of the 3D vertex coordinates, so as to obtain second 3D vertex coordinates; x, y, and z in the matrix represent the radian of rotation, and in the case of the unit of degrees, the matrix needs to be converted into radian units in advance for calculation.
To avoid passing three separate matrices to the GPU for calculation, the three-dimensional transformation matrices in the X, Y and Z directions can be multiplied together to obtain a second three-dimensional transformation matrix; the 3D vertex coordinates are then multiplied by this second matrix to obtain the second 3D vertex coordinates. In other words, all the transformations of the three matrices are ultimately represented by the single second three-dimensional transformation matrix, so only one transformation matrix, representing the rotation in all three directions, needs to be passed to the GPU.
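A sketch of composing the three rotation matrices on the CPU and uploading the single combined matrix is given below. The matrices and the row-vector right-multiplication convention follow the description in this embodiment, but the composition order X then Y then Z, the uniform name uRotation and the function names are assumptions.

#include <math.h>
#include <GLES2/gl2.h>

/* 3x3 matrices stored row-major, matching the row-vector / right-multiply
 * convention of this embodiment (v' = v * M). Illustrative sketch only. */
static void mat3_mul(const float a[9], const float b[9], float out[9])
{
    for (int r = 0; r < 3; ++r)
        for (int c = 0; c < 3; ++c)
            out[3 * r + c] = a[3 * r + 0] * b[0 * 3 + c]
                           + a[3 * r + 1] * b[1 * 3 + c]
                           + a[3 * r + 2] * b[2 * 3 + c];
}

/* Build one combined matrix from the rotation angles (already in radians;
 * degrees must be converted first) and upload it as a single uniform. */
void upload_rotation(GLuint program, float x, float y, float z)
{
    const float rx[9] = { 1, 0, 0,   0, cosf(x), sinf(x),   0, -sinf(x), cosf(x) };
    const float ry[9] = { cosf(y), 0, -sinf(y),   0, 1, 0,   sinf(y), 0, cosf(y) };
    const float rz[9] = { cosf(z), sinf(z), 0,   -sinf(z), cosf(z), 0,   0, 0, 1 };

    float rxy[9], rxyz[9];
    mat3_mul(rx, ry, rxy);     /* assumed order: X, then Y, then Z */
    mat3_mul(rxy, rz, rxyz);

    glUseProgram(program);
    /* "uRotation" is a hypothetical uniform name in the fragment shader. */
    glUniformMatrix3fv(glGetUniformLocation(program, "uRotation"),
                       1, GL_FALSE, rxyz);
}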
In order to give the final image variety, after the three-dimensional rotation transformation has produced the second 3D vertex coordinates, the method further includes: scaling the second 3D vertex coordinates to obtain third 3D vertex coordinates. The scaling could be done by appending a scaling matrix after the second 3D vertex coordinates, but for ease of implementation this embodiment simply adds a variable in the fragment shader: after the texture coordinates have been shifted so that (0, 0) is the center point, they are multiplied by this variable and the result is assigned back to the texture coordinates, which achieves the same scaling effect.
It must be made clear whether the matrix multiplication is a left multiplication or a right multiplication, since different multiplication orders give different results. In this embodiment, all of the above matrices are right-multiplied.
Step three: mapping the 3D vertex coordinates back to a two-dimensional plane to obtain the transformed texture coordinates.
The mapping method of step three is as follows: convert the 3D vertex coordinates into their representation in a spherical coordinate system and obtain the longitude and latitude of the 3D vertex coordinates; then normalize the longitude and latitude to obtain the transformed texture coordinates. The specific process is as follows:
as shown in fig. 2, the spherical coordinate system is usually expressed by (r, θ, Φ) as radial distance, zenith angle, and azimuth angle. In this embodiment, we will use this notation of the tagging convention. Here, it is necessary to calculate what the three values are respectively. In the foregoing, we have already calculated the three-dimensional coordinates of the virtual sphere in the rectangular coordinate system (x, y, z), and now only need to calculate the coordinates of the spherical coordinate system corresponding to the three-dimensional coordinates. The conversion relation between the rectangular coordinate system (x, y, z) and the spherical coordinate system (r, theta, phi) is:
r = √(x² + y² + z²)
θ = arccos(z / r)
φ = arctan(y / x)
where r is the radius of the sphere, θ is the latitude and φ is the longitude, i.e. evaluating the inverse trigonometric functions gives the corresponding angles in radians.
Substituting the virtual three-dimensional coordinates x, y and z into the above formulas gives the latitude θ and the longitude φ. Geographical latitude ranges from 0° to 90° N (north latitude) and 0° to 90° S (south latitude), and geographical longitude ranges from 0° to 180° E (east longitude) and 0° to 180° W (west longitude). Mathematically, however, for the present calculation the latitude can be treated as a value from 0 to π and the longitude as a value from 0 to 2π, which simplifies the final calculation of the transformed texture coordinates, i.e. the normalization of the longitude and latitude.
Since the domain of the arctangent function is all real numbers and its range is (-π/2, π/2), while the domain of the arccosine function is [-1, 1] and its range is [0, π], the calculated longitude φ may be negative, which complicates the subsequent normalization. A check is therefore needed: when φ is less than 0, 2π is added to the calculated value; the position it represents is unchanged, only the value changes from negative to positive. After this operation φ lies in (0, 2π).
Since texture coordinate values range from 0 to 1, the last step is normalization: dividing the longitude φ by 2π and the latitude θ by π gives the final pixel sampling position, i.e. the corresponding transformed texture coordinates. Namely:
u=φ/2π,v=θ/π
where φ is the longitude of the 3D vertex coordinates in the sphere coordinate system, θ is the latitude of the 3D vertex coordinates in the sphere coordinate system, and (u, v) are texture coordinates after transformation.
The (u, v) obtained after the virtual sphere mapping and the spherical coordinate representation are the final transformed texture coordinates, i.e. sampling the two-dimensional image at the pixel positions given by the calculated (u, v) values produces the final planet effect map with the transformation effect.
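The mapping from a point on the virtual sphere to the transformed texture coordinates can be written out as a small helper, sketched below in C for clarity. The patent text describes a single-argument arctangent followed by a +2π correction for negative longitudes; the sketch uses atan2f to resolve the quadrant, which is an adaptation rather than a literal transcription of the described steps.

#include <math.h>

#ifndef M_PI
#define M_PI 3.14159265358979323846
#endif

/* Illustrative sketch: convert a 3D point (x, y, z) on the virtual sphere to
 * the transformed texture coordinates (u, v) via spherical coordinates:
 *   theta = arccos(z / r)   -> latitude in [0, pi]
 *   phi   = longitude, shifted into (0, 2*pi) when negative
 *   u = phi / (2*pi),  v = theta / pi                                      */
void sphere_to_uv(float x, float y, float z, float *u, float *v)
{
    float r     = sqrtf(x * x + y * y + z * z);
    float theta = acosf(z / r);          /* latitude                         */
    float phi   = atan2f(y, x);          /* longitude; atan2 handles quadrant */

    if (phi < 0.0f)
        phi += 2.0f * (float)M_PI;       /* negative longitudes -> (0, 2*pi) */

    *u = phi / (2.0f * (float)M_PI);
    *v = theta / (float)M_PI;
}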
The above calculation already yields a usable planet effect map, but for variety and interest further processing can be applied on this basis: the calculated (u, v) values can be given additional transformations, such as stretching or compressing them, so that the resulting effect maps are more varied.
One final processing step is still needed: the planet effect map obtained so far looks very hard at the image seam, i.e. after the transformation the two edges of the two-dimensional image do not transition into each other well, so the image seam of the planet effect map is smoothed with Alpha transparent blending to obtain a smooth planet effect map.
Specifically, a smooth transition threshold, for example a value blu, is set, and the u value is then tested. If u is greater than blu, no processing is done; if u is less than blu, the value 1.0 - u is used to sample the opposite edge of the image, and this sample is Alpha blended with the original data. The blending formula is:
src * alpha + dst * (1.0 - alpha)
the alpha value here can be used to linearly obtain a smooth transition value between 0 and 1 using u/blu. src represents the source image, dst represents the target image, i.e. the source image and the target image are subjected to a transparency blending operation, and blu represents the blending degree, which is equivalent to the alpha value in the formula. As can be seen from the above formula, if alpha is equal to 0, then only the target image is calculated from this formula; if alpha is equal to 1, then only the source image is calculated by this formula; if alpha is a number between 0 and 1, a blending effect of the source image and the target image may be obtained.
After the shaders have been written, the desired final planet effect map can be drawn simply by compiling the shaders, linking them, and calling the rendering API to render directly.
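Compiling and linking the shaders mentioned here uses the standard OpenGL ES 2.0 calls; a minimal sketch is given below, with the error handling reduced to a status check and the function names chosen for illustration.

#include <GLES2/gl2.h>
#include <stddef.h>

/* Illustrative sketch of compiling one shader stage. */
static GLuint compile_shader(GLenum type, const char *source)
{
    GLuint shader = glCreateShader(type);
    GLint  ok     = GL_FALSE;

    glShaderSource(shader, 1, &source, NULL);
    glCompileShader(shader);
    glGetShaderiv(shader, GL_COMPILE_STATUS, &ok);
    return ok ? shader : 0;
}

/* Compile the vertex and fragment shaders, link them into a program, and
 * return the program handle used for the render call. */
GLuint build_program(const char *vertex_src, const char *fragment_src)
{
    GLuint vs      = compile_shader(GL_VERTEX_SHADER, vertex_src);
    GLuint fs      = compile_shader(GL_FRAGMENT_SHADER, fragment_src);
    GLuint program = glCreateProgram();
    GLint  ok      = GL_FALSE;

    glAttachShader(program, vs);
    glAttachShader(program, fs);
    glLinkProgram(program);
    glGetProgramiv(program, GL_LINK_STATUS, &ok);
    return ok ? program : 0;
}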
According to the method for generating a planet effect map provided by this embodiment of the invention, a pre-written shader is used to obtain the texture, the texture coordinates, the vertex coordinates and the various transformation parameters of the original two-dimensional image, and the shader constructs the planet effect, i.e. a planet effect map of the original two-dimensional image is generated with one tap directly by program. Compared with the prior-art approach of generating the planet effect map with PS, the user obtains the planet effect directly through the shader program without complex operations, which greatly improves the user experience. In addition, in the shader algorithm, the texture coordinates of the original two-dimensional image are converted into 3D vertex coordinates on a virtual sphere and then mapped back to the two-dimensional plane through the spherical coordinate representation to obtain the transformed texture coordinates. This is a virtual three-dimensional reconstruction technique: it achieves the same effect as rendering a three-dimensional object with a real depth buffer, while the calculation and processing are simpler and more efficient. Within the shader, the display effect of the planet effect map can be adjusted by changing parameters, for example translation, scaling and rotation in the X, Y and Z directions, as well as stretching, compression and smoothing, so the display effects are more varied and meet diverse user needs. Moreover, the whole process of obtaining the planet effect map is completed in a single shader, which can easily be extended and reused wherever this function is needed, greatly improving development efficiency.
The above description is only for the specific embodiments of the present invention, but the scope of the present invention is not limited thereto, and any person skilled in the art can easily conceive of the changes or substitutions within the technical scope of the present invention, and all the changes or substitutions should be covered within the scope of the present invention.

Claims (7)

1. A method for generating a star effect map, comprising:
creating an OPENGL ES rendering environment;
inputting an original two-dimensional image to be processed;
drawing a planet effect graph on the original two-dimensional image by adopting a pre-programmed shader and calling an API (application program interface) of the OPENGL ES rendering environment; the shader is used for calculating the transformation texture coordinates of the original two-dimensional image, and the transformation texture coordinates are used for constructing a planet effect;
the method for the shader to compute transformed texture coordinates of the original two-dimensional image comprises the following steps:
acquiring texture coordinates of the original two-dimensional image;
converting the texture coordinates into 3D vertex coordinates in a preset virtual sphere;
mapping the 3D vertex coordinates back to a two-dimensional plane to obtain the transformation texture coordinates;
the method for converting the texture coordinates into 3D vertex coordinates in a preset virtual sphere is as follows:
Figure FDA0003923373610000011
wherein (x, y) are the texture coordinates and (x, y, z) are the 3D vertex coordinates;
the method for mapping the 3D vertex coordinates back to a two-dimensional plane to obtain the transformed texture coordinates comprises the following steps:
converting the 3D vertex coordinates into a coordinate representation mode in a spherical coordinate system, and acquiring longitude and latitude of the 3D vertex coordinates in the spherical coordinate system;
carrying out normalization processing on the longitude and the latitude of the 3D vertex coordinate in a spherical coordinate system to obtain the transformation texture coordinate;
the method for obtaining the transformed texture coordinate by normalizing the longitude and the latitude of the 3D vertex coordinate in the spherical coordinate system comprises the following steps:
u=φ/2π,v=θ/π
wherein φ is the longitude of the 3D vertex coordinate in the spherical coordinate system, θ is the latitude of the 3D vertex coordinate in the spherical coordinate system, and (u, v) are transformation texture coordinates.
2. The method for generating a star effect map according to claim 1, wherein after the converting the texture coordinates into 3D vertex coordinates in a preset virtual sphere, the method further comprises:
performing three-dimensional rotation transformation on the 3D vertex coordinate by adopting a preset three-dimensional transformation matrix to obtain a second 3D vertex coordinate; the three-dimensional transformation matrix includes: a three-dimensional transformation matrix in the X direction, a three-dimensional transformation matrix in the Y direction and a three-dimensional transformation matrix in the Z direction;
and mapping the second 3D vertex coordinate back to a two-dimensional plane to obtain the transformation texture coordinate.
3. The method for generating a planet effect map according to claim 2, wherein after the performing three-dimensional rotation transformation on the 3D vertex coordinates by using a preset three-dimensional transformation matrix to obtain second 3D vertex coordinates, the method further comprises:
zooming the second 3D vertex coordinate to obtain a third 3D vertex coordinate;
and mapping the third 3D vertex coordinate back to a two-dimensional plane to obtain the transformation texture coordinate.
4. The method for generating a planet effect map according to claim 2, wherein the method for performing three-dimensional rotation transformation on the 3D vertex coordinates by using a preset three-dimensional transformation matrix to obtain the second 3D vertex coordinates comprises:
performing rotation transformation on the X direction of the 3D vertex coordinate by adopting the three-dimensional transformation matrix in the X direction, performing rotation transformation on the Y direction of the 3D vertex coordinate by adopting the three-dimensional transformation matrix in the Y direction, and performing rotation transformation on the Z direction of the 3D vertex coordinate by adopting the three-dimensional transformation matrix in the Z direction to obtain a second 3D vertex coordinate;
alternatively,
multiplying the three-dimensional transformation matrix in the X direction, the three-dimensional transformation matrix in the Y direction and the three-dimensional transformation matrix in the Z direction to obtain a second three-dimensional transformation matrix; and multiplying the 3D vertex coordinates by the second three-dimensional transformation matrix to obtain second 3D vertex coordinates.
5. The method according to claim 1, wherein the shader is further configured to stretch or compress the transformed texture coordinates.
6. The method according to claim 1, wherein the shader is further configured to smooth a seam of the image of the star effect map to obtain a smooth star effect map.
7. The method for generating the star effect map according to claim 6, wherein an Alpha transparent blending technology is adopted to smooth the image seams of the star effect map to obtain a smooth star effect map.
CN201810739039.3A 2018-07-06 2018-07-06 Method for generating star effect map Active CN108921778B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810739039.3A CN108921778B (en) 2018-07-06 2018-07-06 Method for generating star effect map

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201810739039.3A CN108921778B (en) 2018-07-06 2018-07-06 Method for generating star effect map

Publications (2)

Publication Number Publication Date
CN108921778A CN108921778A (en) 2018-11-30
CN108921778B true CN108921778B (en) 2022-12-30

Family

ID=64423250

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810739039.3A Active CN108921778B (en) 2018-07-06 2018-07-06 Method for generating star effect map

Country Status (1)

Country Link
CN (1) CN108921778B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110570482B (en) * 2019-08-07 2022-12-09 一诺仪器(中国)有限公司 Constellation diagram drawing method and system and spectrum analyzer
CN111311716B (en) * 2020-02-27 2023-05-12 Oppo广东移动通信有限公司 Animation playing method, device, terminal equipment and storage medium


Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7909696B2 (en) * 2001-08-09 2011-03-22 Igt Game interaction in 3-D gaming environments
US20140085295A1 (en) * 2012-09-21 2014-03-27 Tamaggo Inc. Direct environmental mapping method and system
CN103605883A (en) * 2013-11-01 2014-02-26 中国人民解放军信息工程大学 Space-air-earth-integrated situation expression engine and engine viewpoint cross-scale seamless switching method
CN103544677B (en) * 2013-11-01 2017-03-01 中国人民解放军信息工程大学 Space-air-ground integration situation expression engine
GB2524960B (en) * 2014-04-04 2019-05-15 Imagineer Systems Ltd Processing of digital motion images
US10102610B2 (en) * 2016-04-05 2018-10-16 Qualcomm Incorporated Dual fisheye images stitching for spherical video

Patent Citations (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2004184236A (en) * 2002-12-03 2004-07-02 Sumitomo Rubber Ind Ltd Measuring method for rotational characteristics and flight characteristics of spherical body
CN102903139A (en) * 2012-09-07 2013-01-30 罗健欣 Accelerated rendering method for contours
CN103021013A (en) * 2012-11-28 2013-04-03 无锡羿飞科技有限公司 High-efficiency processing method for spherical display and rotary output image of projector
CN104637089A (en) * 2015-02-15 2015-05-20 腾讯科技(深圳)有限公司 Three-dimensional model data processing method and device
CN105913478A (en) * 2015-12-28 2016-08-31 乐视致新电子科技(天津)有限公司 360-degree panorama display method and display module, and mobile terminal
CN105701828A (en) * 2016-01-14 2016-06-22 广州视睿电子科技有限公司 Image-processing method and device
CN106101741A (en) * 2016-07-26 2016-11-09 武汉斗鱼网络科技有限公司 Internet video live broadcasting platform is watched the method and system of panoramic video
CN106210859A (en) * 2016-08-11 2016-12-07 合网络技术(北京)有限公司 Panoramic video rendering intent and device
JP2018026989A (en) * 2016-08-13 2018-02-15 英文 久保田 Method for increasing and decreasing gravity on planet
CN106355634A (en) * 2016-08-30 2017-01-25 北京像素软件科技股份有限公司 Sun simulating method and device
CN106710003A (en) * 2017-01-09 2017-05-24 成都品果科技有限公司 Three-dimensional photographing method and system based on OpenGL ES (Open Graphics Library for Embedded System)
CN106851244A (en) * 2017-01-10 2017-06-13 北京阿吉比科技有限公司 The method and system of 3D panoramic videos are watched based on internet video live broadcasting platform
CN107093207A (en) * 2017-04-12 2017-08-25 武汉大学 A kind of dynamic and visual method of the natural gas leaking diffusion based on GPGPU
CN107146274A (en) * 2017-05-05 2017-09-08 上海兆芯集成电路有限公司 Image data processing system, texture mapping compression and the method for producing panoramic video
CN107248193A (en) * 2017-05-22 2017-10-13 北京红马传媒文化发展有限公司 The method, system and device that two dimensional surface is switched over virtual reality scenario
CN107564089A (en) * 2017-08-10 2018-01-09 腾讯科技(深圳)有限公司 Three dimensional image processing method, device, storage medium and computer equipment
CN107945274A (en) * 2017-12-26 2018-04-20 苏州蜗牛数字科技股份有限公司 A kind of crater terrain generation method and device based on fertile sharp noise
CN108154553A (en) * 2018-01-04 2018-06-12 中测新图(北京)遥感技术有限责任公司 The seamless integration method and device of a kind of threedimensional model and monitor video

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Implementing a 3D planet orbiting effect with ActionScript 3.0; Yang Jianxiong; Computer Knowledge and Technology; 2010-05-05; Vol. 6, No. 13; pp. 3558, 3560 *
Research and implementation of visualization of typical layers of three-dimensional geospatial data; Wang Yuxia; China Masters' Theses Full-text Database, Basic Sciences; 2017-02-15, No. 2; A008-89 *

Also Published As

Publication number Publication date
CN108921778A (en) 2018-11-30

Similar Documents

Publication Publication Date Title
US10621767B2 (en) Fisheye image stitching for movable cameras
US10275928B2 (en) Dual fisheye image stitching for spherical image content
TWI334581B (en) Texture engine, graphics processing unit and texture processing method thereof
CN106558017B (en) Spherical display image processing method and system
WO2021249091A1 (en) Image processing method and apparatus, computer storage medium, and electronic device
KR101359011B1 (en) 3-dimensional visualization system for displaying earth environment image
CN108921778B (en) Method for generating star effect map
CN114742931A (en) Method and device for rendering image, electronic equipment and storage medium
CN114782612A (en) Image rendering method and device, electronic equipment and storage medium
CN111508058A (en) Method and device for three-dimensional reconstruction of image, storage medium and electronic equipment
CN113077541B (en) Virtual sky picture rendering method and related equipment
CN117274527A (en) Method for constructing three-dimensional visualization model data set of generator equipment
CN109658495B (en) Rendering method and device for ambient light shielding effect and electronic equipment
JP6223916B2 (en) Information processing apparatus, method, and program
CN116310041A (en) Rendering method and device of internal structure effect, electronic equipment and storage medium
CN109816761A (en) Figure conversion method, device, storage medium and electronic equipment
CN115018968A (en) Image rendering method and device, storage medium and electronic equipment
CN112862981B (en) Method and apparatus for presenting a virtual representation, computer device and storage medium
KR100603134B1 (en) Method and apparatus for 3 dimension rendering processing using the monochromatic lighting
US6188409B1 (en) 3D graphics device
CN113470154B (en) Image processing method, device, electronic equipment and storage medium
CN110728619B (en) Panoramic image stitching rendering method and device
CN116993877A (en) Method, device and storage medium for simulating special effect of object drifting
CN115019020A (en) Model shadow generation method and device, readable storage medium and electronic equipment
CN114612621A (en) Panorama generation method and system based on three-dimensional tilt model

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant