CN108921778A - Method for generating a celestial body effect image - Google Patents

Method for generating a celestial body effect image

Info

Publication number
CN108921778A
CN108921778A (application number CN201810739039.3A)
Authority
CN
China
Prior art keywords
coordinate
celestial body
body effect
apex coordinate
transformation
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201810739039.3A
Other languages
Chinese (zh)
Other versions
CN108921778B (en)
Inventor
黄超
徐滢
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Chengdu Pinguo Technology Co Ltd
Original Assignee
Chengdu Pinguo Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Chengdu Pinguo Technology Co Ltd filed Critical Chengdu Pinguo Technology Co Ltd
Priority to CN201810739039.3A priority Critical patent/CN108921778B/en
Publication of CN108921778A publication Critical patent/CN108921778A/en
Application granted granted Critical
Publication of CN108921778B publication Critical patent/CN108921778B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G06T3/08
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/003D [Three Dimensional] image rendering
    • G06T15/005General purpose rendering architectures

Abstract

The present invention discloses a method for generating a celestial body effect image, belonging to the technical field of image processing. The method includes: creating an OPENGL ES rendering context; inputting an original two-dimensional image to be processed; and, using a pre-written shader, calling the API of the OPENGL ES rendering context to render the original two-dimensional image into a celestial body effect image. The shader is used to compute transformed texture coordinates of the original two-dimensional image, and the transformed texture coordinates are used to construct the celestial body effect. With the method provided by the invention, user operation is simple: a well-formed celestial body effect image can be generated with a single tap, which greatly improves the user experience.

Description

Method for generating a celestial body effect image
Technical field
The present invention relates to the technical field of image processing, and more particularly to a method for generating a celestial body effect image.
Background technique
The celestial body effect is similar to a 360-degree panoramic photograph: the picture is presented as a circular composition in which the two ends of the image join seamlessly.
In the prior art, an ordinary two-dimensional image is usually converted into a celestial body effect image with PS (Photoshop). The concrete operations are as follows: 1. open the image to be converted in PS and adjust it to a square; 2. open the distort function in the PS filters, warp the square image using polar coordinates, and tune the parameters to obtain the celestial body effect image; 3. retouch the joined edges of the celestial body effect image so that the transition at the seam is smooth and natural.
It can be seen from the above steps that generating a celestial body effect image with the prior art requires the user to have basic PS experience. Moreover, adjusting the image to a square in PS may distort the image, for example by compressing or stretching it. In addition, the prior art offers few adjustment options, so the resulting effect is monotonous. At the same time, smoothing the image seam is a complicated operation for ordinary users. These disadvantages prevent users from easily obtaining a good celestial body effect image.
Summary of the invention
The present invention is intended to provide a method for generating a celestial body effect image in which user operation is simple and a good celestial body effect image can be generated quickly, thereby greatly improving the user experience.
In order to achieve the above objectives, the technical solution adopted by the present invention is as follows:
A method for generating a celestial body effect image, including: creating an OPENGL ES rendering context; inputting an original two-dimensional image to be processed; and, using a pre-written shader, calling the API of the OPENGL ES rendering context to render the original two-dimensional image into a celestial body effect image. The shader is used to compute transformed texture coordinates of the original two-dimensional image, and the transformed texture coordinates are used to construct the celestial body effect.
Preferably, the method by which the shader computes the transformed texture coordinates of the original two-dimensional image includes: obtaining the texture coordinates of the original two-dimensional image; converting the texture coordinates into 3D vertex coordinates on a preset virtual sphere; and mapping the 3D vertex coordinates back onto the two-dimensional plane to obtain the transformed texture coordinates.
Preferably, the texture coordinates are converted into the 3D vertex coordinates on the preset virtual sphere as follows:
where (X, Y) are the texture coordinates and (x, y, z) are the 3D vertex coordinates;
Mapping the 3D vertex coordinates back onto the two-dimensional plane to obtain the transformed texture coordinates comprises: converting the 3D vertex coordinates into their representation in a spherical coordinate system to obtain the longitude and latitude of the 3D vertex coordinates in the spherical coordinate system; and normalizing the longitude and latitude of the 3D vertex coordinates in the spherical coordinate system to obtain the transformed texture coordinates.
Preferably, the longitude and latitude of the 3D vertex coordinates in the spherical coordinate system are normalized to obtain the transformed texture coordinates as follows:
u = φ / (2π), v = θ / π
where φ is the longitude of the 3D vertex coordinates in the spherical coordinate system, θ is the latitude of the 3D vertex coordinates in the spherical coordinate system, and (u, v) are the transformed texture coordinates.
Further, after converting the texture coordinates into the 3D vertex coordinates on the preset virtual sphere, the method further includes: performing a three-dimensional rotation transformation on the 3D vertex coordinates using preset three-dimensional transformation matrices to obtain second 3D vertex coordinates, the three-dimensional transformation matrices including a transformation matrix for the X direction, a transformation matrix for the Y direction and a transformation matrix for the Z direction; and mapping the second 3D vertex coordinates back onto the two-dimensional plane to obtain the transformed texture coordinates.
Further, after performing the three-dimensional rotation transformation on the 3D vertex coordinates using the preset three-dimensional transformation matrices to obtain the second 3D vertex coordinates, the method further includes: scaling the second 3D vertex coordinates to obtain third 3D vertex coordinates; and mapping the third 3D vertex coordinates back onto the two-dimensional plane to obtain the transformed texture coordinates.
Preferably, performing the three-dimensional rotation transformation on the 3D vertex coordinates using the preset three-dimensional transformation matrices to obtain the second 3D vertex coordinates includes: performing a rotation transformation on the X direction of the 3D vertex coordinates using the transformation matrix for the X direction, performing a rotation transformation on the Y direction of the 3D vertex coordinates using the transformation matrix for the Y direction, and performing a rotation transformation on the Z direction of the 3D vertex coordinates using the transformation matrix for the Z direction, to obtain the second 3D vertex coordinates;
alternatively, multiplying the transformation matrix for the X direction, the transformation matrix for the Y direction and the transformation matrix for the Z direction to obtain a second three-dimensional transformation matrix, and multiplying the 3D vertex coordinates by the second three-dimensional transformation matrix to obtain the second 3D vertex coordinates.
Further, the shader is also used to stretch or compress the transformed texture coordinates.
Further, the shader is also used to smooth the image seam of the celestial body effect image to obtain a smooth celestial body effect image.
Preferably, the image seam of the celestial body effect image is smoothed using Alpha transparency blending to obtain a smooth celestial body effect image.
In the method for generating a celestial body effect image provided by the embodiments of the present invention, a pre-written shader obtains the transformed texture coordinates of the original two-dimensional image, and the celestial body effect image is constructed from the transformed texture coordinates in the configured rendering context; that is, the celestial body effect image is generated from the original two-dimensional image with a single tap, directly by program. Compared with the existing technique of generating celestial body effect images with PS, the user does not need to perform complicated operations: connecting the shader directly produces the celestial body effect, which greatly improves the user experience. Moreover, in the shader algorithm, the 2D texture coordinates of the original two-dimensional image are converted into 3D vertex coordinates on a virtual sphere, and the 3D vertex coordinates are then mapped back onto the two-dimensional plane to obtain the transformed texture coordinates of the original two-dimensional image. This is a virtual three-dimensional reconstruction: it achieves the same effect as truly rendering a three-dimensional object with a depth buffer, but the computation and processing are much simpler and more efficient. In addition, the whole process of obtaining the celestial body effect image is completed within a single shader, so it is convenient to extend and reuse wherever this function is needed, which greatly improves development efficiency.
Detailed description of the invention
Fig. 1 is the method flow diagram of the embodiment of the present invention;
Fig. 2 is a schematic diagram of the spherical coordinate system used in the embodiment of the present invention.
Specific embodiment
In order to make the objectives, technical solutions and advantages of the present invention clearer, the present invention is further described below in conjunction with the accompanying drawings.
Fig. 1 is the method flow diagram of the embodiment of the present invention, which includes:
Step 101: create an OPENGL ES rendering context;
In the present embodiment, in order to achieve real-time rendering on a mobile phone or other mobile terminal and give the user the best experience, data processing is performed directly on the GPU. Of course, where real-time rendering is not required, data processing can also be performed on the CPU. The basis for processing on the GPU is to use OPENGL ES as the rendering API (Application Programming Interface). The creation is specifically divided into the following two steps:
Create the OPENGL ES environment. This environment is a rendering context, and different platforms have different creation procedures, but every high-level interface ultimately needs a rendering context in order to render an image; this environment is called the OPENGL ES context. Whether the OPENGL ES API calls succeed depends on this environment; if the environment is incorrect, the rendering process cannot be completed successfully.
Create the frame buffer. This buffer is required when drawing the image (rendering the celestial body effect image). A depth buffer, a stencil buffer, a color buffer and so on can be attached to it. The present invention mainly performs image processing, so a single color buffer is attached in a render-to-texture manner.
Step 102: input the original two-dimensional image to be processed;
Step 103: using the pre-written shader, call the API of the OPENGL ES rendering context to render the original two-dimensional image into the celestial body effect image. The shader is used to compute the transformed texture coordinates of the original two-dimensional image, and the transformed texture coordinates are used to construct the celestial body effect.
In the present embodiment, the overall processing flow is as follows: first, a texture image is generated from the original two-dimensional image and fed into the GPU as the input image, and another texture image created from empty data is used as the output image. Then the vertex coordinates and texture coordinates of the original two-dimensional image are prepared, the pre-written shader program is compiled, the parameters in the shader are set, and the program is submitted to the GPU to run; the original two-dimensional image is rendered and the celestial body effect image is finally obtained.
The most important step in the present invention is the writing of the shader, which is the key algorithm of this embodiment. Specifically, the method by which the shader obtains the transformation parameters of the original two-dimensional image includes:
Step 1: obtain the texture coordinates of the original two-dimensional image;
Once the rendering setup above is ready, how the image is rendered, that is, what kind of effect is to be achieved, is essentially determined by how the fragment shader is written in the program. The vertex shader does not perform any special operations; it mainly interpolates the vertex coordinates and texture coordinates, and the interpolated vertex coordinates and texture coordinates are then used in the fragment shader (a minimal vertex shader sketch follows the coordinate data below). The "vertex coordinates" are the vertices required to draw an image and define the shape of the primitive; the coordinate values given in this embodiment define a quadrilateral. The "texture coordinates" are a group of floating-point values between 0 and 1 that indicate where the texture is sampled, determining how the image is displayed within this quadrilateral region; their values are all specified as input.
The vertex coordinate data prepared here are: [-1.0, 1.0, -1.0, -1.0, 1.0, 1.0, 1.0, -1.0]
V0:-1.0,1.0
V1:-1.0,-1.0
V2:1.0,1.0
V3:1.0,-1.0
The texture coordinate data are: [0.0, 1.0, 0.0, 0.0, 1.0, 1.0, 1.0, 0.0]
UV0:0.0,1.0
UV1:0.0,0.0
UV2:1.0,1.0
UV3:1.0,0.0
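For illustration, a minimal pass-through vertex shader consistent with the description above might look like the following sketch; the attribute and varying names (aPosition, aTexCoord, vTexCoord) are illustrative assumptions, not names taken from the patent.

```glsl
// Minimal GLSL ES vertex shader sketch (identifier names are assumptions).
attribute vec2 aPosition;   // quad vertex, one of the four vertex values listed above
attribute vec2 aTexCoord;   // corresponding texture coordinate in [0, 1]
varying vec2 vTexCoord;     // interpolated and consumed by the fragment shader

void main() {
    vTexCoord = aTexCoord;                     // pass the texture coordinate through
    gl_Position = vec4(aPosition, 0.0, 1.0);   // place the quad vertex in clip space
}
```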
The transformation of a two-dimensional image can be regarded as the process of computing the texture coordinates of that image, because the final image effect is the result of the fragment shader's processing: it can be understood as using different texture coordinates for each pixel of the two-dimensional image to sample different positions of the original two-dimensional image, thereby composing the final effect image.
The texture coordinates input in this embodiment range from 0 to 1. For convenience of calculation, the centre point (0.5, 0.5) of the texture coordinates is shifted to the point (0.0, 0.0); that is, 0.5 is subtracted from both the u and v values, mapping the range to -0.5 to 0.5. In the subsequent steps a virtual sphere with diameter 1 is constructed, so the texture coordinate range after this shift matches the coordinate range of the virtual sphere.
Step 2: convert the texture coordinates into the 3D vertex coordinates on the virtual sphere;
The difference between the "virtual sphere" and a true three-dimensional space is that the virtual sphere is simulated in two-dimensional space rather than being a real three-dimensional space. A real three-dimensional space would additionally require a depth buffer attached to the frame buffer mentioned above in order to represent the Z axis, i.e. the depth information. Because the image is not drawn in a real three-dimensional space, the texture coordinates are converted into the 3D vertex coordinates on the virtual sphere using the following formula:
where (X, Y) are the texture coordinates and (x, y, z) are the 3D vertex coordinates.
This is a mapping relation: the texture coordinates of the original two-dimensional image are converted into the three components x, y, z, and in this way the mapping points of the three-dimensional sphere in the two-dimensional plane are obtained.
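The exact conversion formula is not reproduced above; purely as a hedged sketch, assuming the centred texture coordinates are lifted onto the upper hemisphere of a sphere of diameter 1, the conversion inside a GLSL ES fragment shader could look like this (the hemisphere lift and the function name toVirtualSphere are assumptions, not the patent's published formula):

```glsl
// Sketch only: lift centred texture coordinates onto a virtual sphere of diameter 1.
// The patent's exact mapping is not reproduced; this hemisphere lift is an assumption.
vec3 toVirtualSphere(vec2 uv) {
    vec2 p = uv - 0.5;                        // shift the centre (0.5, 0.5) to (0.0, 0.0)
    float r = 0.5;                            // sphere radius (diameter 1)
    float d2 = max(r * r - dot(p, p), 0.0);   // clamp so the point stays on the sphere
    return vec3(p.x, p.y, sqrt(d2));          // (x, y, z) vertex on the virtual sphere
}
```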
In order to give the final displayed image effect more variety, after converting the texture coordinates into the 3D vertex coordinates on the virtual sphere, the method further includes: performing a three-dimensional rotation transformation on the 3D vertex coordinates using preset three-dimensional transformation matrices to obtain second 3D vertex coordinates. The three-dimensional transformation matrices include:
The transformation matrix for the X direction: [1.0, 0.0, 0.0, 0.0, cos(x), sin(x), 0.0, -sin(x), cos(x)], used for rotation about the X axis;
The transformation matrix for the Y direction: [cos(y), 0.0, -sin(y), 0.0, 1.0, 0.0, sin(y), 0.0, cos(y)], used for rotation about the Y axis;
The transformation matrix for the Z direction: [cos(z), sin(z), 0.0, -sin(z), cos(z), 0.0, 0.0, 0.0, 1.0], used for rotation about the Z axis.
Specifically, a rotation transformation is applied to the X direction of the 3D vertex coordinates using the transformation matrix for the X direction, a rotation transformation is applied to the Y direction of the 3D vertex coordinates using the transformation matrix for the Y direction, and a rotation transformation is applied to the Z direction of the 3D vertex coordinates using the transformation matrix for the Z direction, yielding the second 3D vertex coordinates. The x, y and z in the above matrices represent the rotation angle in radians; if the angle is given in degrees, it must first be converted to radians before the calculation.
In order not to have to pass all three matrices into the GPU for the calculation, the transformation matrix for the X direction, the transformation matrix for the Y direction and the transformation matrix for the Z direction can also be multiplied together to obtain a second three-dimensional transformation matrix; the 3D vertex coordinates are then multiplied by this second three-dimensional transformation matrix to obtain the second 3D vertex coordinates. In other words, all the transformations of these three matrices are ultimately expressed by the single second three-dimensional transformation matrix, so only one transformation matrix needs to be passed to the GPU, and this matrix describes the rotation in all three directions.
Also in order to give the final displayed image effect more variety, after the three-dimensional rotation transformation has been applied to the 3D vertex coordinates using the preset three-dimensional transformation matrices to obtain the second 3D vertex coordinates, the method further includes: scaling the second 3D vertex coordinates to obtain third 3D vertex coordinates. The scaling could be done by appending a scaling matrix after the second 3D vertex coordinates, but for ease of implementation this embodiment simply adds a variable in the fragment shader: after the texture coordinates have been shifted so that they are centred on the point (0, 0), they are multiplied by this variable and the result is assigned back to the texture coordinates, which achieves the same zoom effect.
It must be made clear here whether the above matrix multiplication is a left multiplication or a right multiplication; different multiplication orders give different results. In the present embodiment, the above matrices are right-multiplied.
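As a hedged sketch of this step (the uniform names uAngles and uScale are illustrative assumptions), the three rotation matrices listed above can be built in the fragment shader, combined into a single matrix, right-multiplied with the sphere vertex, and the scale variable applied to the centred coordinates:

```glsl
// Sketch: combine the three rotation matrices and apply scaling (uniform names are assumptions).
uniform vec3 uAngles;   // rotation angles about X, Y and Z, in radians
uniform float uScale;   // zoom factor applied to the coordinates centred on (0, 0)

vec3 rotateVertex(vec3 v) {
    float x = uAngles.x, y = uAngles.y, z = uAngles.z;
    // GLSL mat3 constructors fill columns first; the value order follows the arrays above.
    mat3 rx = mat3(1.0, 0.0, 0.0,   0.0, cos(x), sin(x),   0.0, -sin(x), cos(x));
    mat3 ry = mat3(cos(y), 0.0, -sin(y),   0.0, 1.0, 0.0,   sin(y), 0.0, cos(y));
    mat3 rz = mat3(cos(z), sin(z), 0.0,   -sin(z), cos(z), 0.0,   0.0, 0.0, 1.0);
    mat3 m = rx * ry * rz;   // one combined matrix, so only one matrix need be uploaded
    return v * m;            // right multiplication, as stated in the embodiment
}

vec2 scaleCoord(vec2 centred) {
    return centred * uScale; // zooming by scaling the coordinates centred on (0, 0)
}
```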
Step 3: map the 3D vertex coordinates back onto the two-dimensional plane to obtain the transformed texture coordinates.
The mapping method of step 3 is: convert the 3D vertex coordinates into their representation in the spherical coordinate system to obtain the longitude and latitude of the 3D vertex coordinates; normalize the longitude and latitude of the 3D vertex coordinates to obtain the transformed texture coordinates. The detailed process is as follows:
As shown in Fig. 2, a spherical coordinate system conventionally uses (r, θ, φ) to denote the radial distance, the zenith angle and the azimuth angle, and this embodiment uses this labelling convention; these three values need to be computed. The three-dimensional coordinates (x, y, z) of the virtual sphere in the rectangular coordinate system have already been computed above, so only the corresponding spherical coordinates need to be computed now. The conversion between the rectangular coordinates (x, y, z) and the spherical coordinates (r, θ, φ) is:
r = √(x² + y² + z²), θ = arccos(z / r), φ = arctan(y / x)
where r represents the sphere radius, θ represents the latitude and φ represents the longitude; evaluating the inverse trigonometric functions gives the corresponding radian values.
After substituting the virtual three-dimensional coordinates x, y, z into the above formula, the latitude θ and longitude φ are obtained. The value range of geographic latitude is 0° to 90° N (north latitude, denoted N) and 0° to 90° S (south latitude, denoted S); the value range of geographic longitude is 0° to 180° E (east longitude, denoted E) and 0° to 180° W (west longitude, denoted W). Mathematically, however, in this calculation the value range of latitude can be taken numerically as 0 to π and the value range of longitude as 0 to 2π, which makes the computation of the final transformed texture coordinates, i.e. the normalization of longitude and latitude, simple.
Since the domain of the arctan function is the real numbers R and its range is (-π/2, π/2), and the domain of the arccos function is [-1, 1] with range [0, π], the computed longitude φ may be negative, which would cause trouble in the subsequent normalization. A check is therefore needed: when the value of φ is less than 0, 2π is added to the computed φ. The position it represents is the same; the value merely changes from negative to positive. After this single step, φ falls within (0, 2π).
Since texture coordinate values range from 0 to 1, the final step is the normalization of the texture coordinate values: the longitude φ only needs to be divided by 2π and the latitude θ by π to give the final sampling position of the corresponding pixel, i.e. the transformed texture coordinates. That is:
u = φ / (2π), v = θ / π
where φ is the longitude of the 3D vertex coordinates in the spherical coordinate system, θ is the latitude of the 3D vertex coordinates in the spherical coordinate system, and (u, v) are the transformed texture coordinates.
The (u, v) here are the final transformed texture coordinates obtained after mapping through the virtual sphere and expressing the result in the spherical coordinate system; by sampling the original two-dimensional image at the positions given by the final computed (u, v) values, the celestial body effect image with the final transformed effect is obtained.
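A hedged GLSL ES sketch of this mapping step follows. It uses GLSL's two-argument atan, whose range differs from the single-argument arctan discussed above, but the negative-longitude wrap is applied in the same way; the function name sphereToUV is an illustrative assumption.

```glsl
// Sketch: map a virtual-sphere vertex back to transformed texture coordinates.
const float PI = 3.14159265;

vec2 sphereToUV(vec3 v) {
    float r = length(v);              // sphere radius
    float theta = acos(v.z / r);      // latitude, in [0, PI]
    float phi = atan(v.y, v.x);       // longitude, in (-PI, PI]
    if (phi < 0.0) {
        phi += 2.0 * PI;              // wrap negative longitude into (0, 2*PI)
    }
    return vec2(phi / (2.0 * PI), theta / PI);   // normalize to the (u, v) texture coordinates
}
```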
Through the above calculation, the final celestial body effect image has already been obtained. However, for variety and interest of the effect, further processing can be done on this basis: the computed (u, v) values are transformed again for specific changes, for example by stretching or compressing the (u, v) values, so that the displayed effect image has variety.
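As a brief sketch of such a variation (the uniform uStretch and the wrapping with fract are assumptions; the patent does not specify how out-of-range values are handled), the stretch or compression can be expressed as a per-axis multiplication of the transformed coordinates:

```glsl
// Sketch: stretch (factor > 1.0) or compress (factor < 1.0) the transformed coordinates.
uniform vec2 uStretch;   // per-axis stretch/compress factors (assumed name)

vec2 stretchUV(vec2 uv) {
    return fract(uv * uStretch);   // fract keeps the result in [0, 1) so sampling stays valid
}
```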
One last step of processing remains: the celestial body effect image obtained so far is very abrupt at the image seam, that is, there is no good transition between the transformed edges of the two-dimensional image. Here, Alpha transparency blending is used to smooth the image seam of the celestial body effect image, giving a smooth celestial body effect image.
The specific approach is to set a smooth transition value, for example a blur value, and then test the u value: if the u value is greater than this blur value, no processing is done; if the u value is less than the blur value, 1.0 - u is used as the coordinate of the mirrored edge image data, and this data is then Alpha-blended with the original data. The blending formula is as follows:
src * alpha + dst * (1.0 - alpha),
where the alpha value can be obtained linearly as u / blur, giving a smooth transition value between 0 and 1. src represents the source image and dst represents the target image, i.e. a transparency blending operation is performed between the source image and the target image, and blur represents the blending extent, corresponding to the alpha value in the formula here. It can be seen from the above formula that if alpha equals 0, the formula yields only the target image; if alpha equals 1, the formula yields only the source image; and if alpha is a value between 0 and 1, a blend of the source image and the target image is obtained.
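A hedged GLSL ES sketch of this seam smoothing, assuming the blended pixel is sampled at the mirrored coordinate 1.0 - u near the seam (uTexture and uBlur are illustrative names):

```glsl
// Sketch: smooth the seam near u = 0 by Alpha-blending with the mirrored edge.
uniform sampler2D uTexture;   // the image being sampled (assumed name)
uniform float uBlur;          // smooth transition width, e.g. 0.05 (assumed name)

vec4 smoothSeam(vec2 uv) {
    vec4 src = texture2D(uTexture, uv);
    if (uv.x >= uBlur) {
        return src;                                            // away from the seam: no processing
    }
    vec4 dst = texture2D(uTexture, vec2(1.0 - uv.x, uv.y));    // sample the opposite (mirrored) edge
    float alpha = uv.x / uBlur;                                // linear transition value in [0, 1]
    return src * alpha + dst * (1.0 - alpha);                  // blending formula from the description
}
```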
After the shader has been written, it only remains to compile the shader, link the shader and call the rendering API to draw, and the final required celestial body effect image is rendered.
In the method for generating a celestial body effect image provided by the embodiments of the present invention, a pre-written shader is used to obtain the texture, texture coordinates, vertex coordinates and various transformation parameters of the original two-dimensional image, and the shader is used to construct the celestial body effect; that is, the celestial body effect image is generated from the original two-dimensional image with a single tap, directly by program. Compared with the existing technique of generating celestial body effect images with PS, the user does not need to perform complicated operations: the celestial body effect is obtained directly from this shader program, which greatly improves the user experience. Moreover, in the shader algorithm, the texture coordinates of the original two-dimensional image are converted into 3D vertex coordinates on the virtual sphere, and the 3D vertex coordinates are mapped back onto the two-dimensional plane via their spherical coordinate representation to obtain the transformed texture coordinates of the original two-dimensional image. This is a virtual three-dimensional reconstruction: it achieves the same effect as truly rendering a three-dimensional object with a depth buffer, but the computation and processing are simpler and more efficient. In the shader, the displayed effect of the celestial body effect image can be adjusted by tuning parameters, for example movement, scaling and rotation in the X, Y and Z directions, stretching, compression, smoothing and so on, so that the displayed effect is more varied and meets the various needs of users. In addition, the whole process of obtaining the celestial body effect image is completed within a single shader, so it is convenient to extend and reuse wherever this function is needed, which greatly improves development efficiency.
The above description is merely a specific embodiment, but the protection scope of the present invention is not limited thereto. Any change or replacement that can readily be conceived by those familiar with the art within the technical scope disclosed by the present invention shall be covered by the protection scope of the present invention.

Claims (10)

1. A method for generating a celestial body effect image, characterized by including:
creating an OPENGL ES rendering context;
inputting an original two-dimensional image to be processed;
using a pre-written shader, and calling the API of the OPENGL ES rendering context to render the original two-dimensional image into a celestial body effect image; wherein the shader is used to compute transformed texture coordinates of the original two-dimensional image, and the transformed texture coordinates are used to construct the celestial body effect.
2. The method for generating a celestial body effect image according to claim 1, characterized in that the method by which the shader computes the transformed texture coordinates of the original two-dimensional image includes:
obtaining the texture coordinates of the original two-dimensional image;
converting the texture coordinates into 3D vertex coordinates on a preset virtual sphere;
mapping the 3D vertex coordinates back onto the two-dimensional plane to obtain the transformed texture coordinates.
3. The method for generating a celestial body effect image according to claim 2, characterized in that the texture coordinates are converted into the 3D vertex coordinates on the preset virtual sphere as follows:
where (X, Y) are the texture coordinates and (x, y, z) are the 3D vertex coordinates;
and mapping the 3D vertex coordinates back onto the two-dimensional plane to obtain the transformed texture coordinates comprises:
converting the 3D vertex coordinates into their representation in a spherical coordinate system to obtain the longitude and latitude of the 3D vertex coordinates in the spherical coordinate system;
normalizing the longitude and latitude of the 3D vertex coordinates in the spherical coordinate system to obtain the transformed texture coordinates.
4. The method for generating a celestial body effect image according to claim 3, characterized in that the longitude and latitude of the 3D vertex coordinates in the spherical coordinate system are normalized to obtain the transformed texture coordinates as follows:
u = φ / (2π), v = θ / π
where φ is the longitude of the 3D vertex coordinates in the spherical coordinate system, θ is the latitude of the 3D vertex coordinates in the spherical coordinate system, and (u, v) are the transformed texture coordinates.
5. The method for generating a celestial body effect image according to claim 2, characterized in that, after converting the texture coordinates into the 3D vertex coordinates on the preset virtual sphere, the method further includes:
performing a three-dimensional rotation transformation on the 3D vertex coordinates using preset three-dimensional transformation matrices to obtain second 3D vertex coordinates, the three-dimensional transformation matrices including a transformation matrix for the X direction, a transformation matrix for the Y direction and a transformation matrix for the Z direction;
mapping the second 3D vertex coordinates back onto the two-dimensional plane to obtain the transformed texture coordinates.
6. The method for generating a celestial body effect image according to claim 5, characterized in that, after performing the three-dimensional rotation transformation on the 3D vertex coordinates using the preset three-dimensional transformation matrices to obtain the second 3D vertex coordinates, the method further includes:
scaling the second 3D vertex coordinates to obtain third 3D vertex coordinates;
mapping the third 3D vertex coordinates back onto the two-dimensional plane to obtain the transformed texture coordinates.
7. The method for generating a celestial body effect image according to claim 5, characterized in that performing the three-dimensional rotation transformation on the 3D vertex coordinates using the preset three-dimensional transformation matrices to obtain the second 3D vertex coordinates includes:
performing a rotation transformation on the X direction of the 3D vertex coordinates using the transformation matrix for the X direction, performing a rotation transformation on the Y direction of the 3D vertex coordinates using the transformation matrix for the Y direction, and performing a rotation transformation on the Z direction of the 3D vertex coordinates using the transformation matrix for the Z direction, to obtain the second 3D vertex coordinates;
alternatively,
multiplying the transformation matrix for the X direction, the transformation matrix for the Y direction and the transformation matrix for the Z direction to obtain a second three-dimensional transformation matrix; and multiplying the 3D vertex coordinates by the second three-dimensional transformation matrix to obtain the second 3D vertex coordinates.
8. The method for generating a celestial body effect image according to claim 1, characterized in that the shader is also used to stretch or compress the transformed texture coordinates.
9. The method for generating a celestial body effect image according to claim 1, characterized in that the shader is also used to smooth the image seam of the celestial body effect image to obtain a smooth celestial body effect image.
10. The method for generating a celestial body effect image according to claim 9, characterized in that the image seam of the celestial body effect image is smoothed using Alpha transparency blending to obtain a smooth celestial body effect image.
CN201810739039.3A 2018-07-06 2018-07-06 Method for generating star effect map Active CN108921778B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810739039.3A CN108921778B (en) 2018-07-06 2018-07-06 Method for generating star effect map

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201810739039.3A CN108921778B (en) 2018-07-06 2018-07-06 Method for generating star effect map

Publications (2)

Publication Number Publication Date
CN108921778A true CN108921778A (en) 2018-11-30
CN108921778B CN108921778B (en) 2022-12-30

Family

ID=64423250

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810739039.3A Active CN108921778B (en) 2018-07-06 2018-07-06 Method for generating star effect map

Country Status (1)

Country Link
CN (1) CN108921778B (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110570482A (en) * 2019-08-07 2019-12-13 一诺仪器(中国)有限公司 Constellation diagram drawing method and system and spectrum analyzer
CN111311716A (en) * 2020-02-27 2020-06-19 Oppo广东移动通信有限公司 Animation playing method and device, terminal equipment and storage medium

Citations (24)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2004184236A (en) * 2002-12-03 2004-07-02 Sumitomo Rubber Ind Ltd Measuring method for rotational characteristics and flight characteristics of spherical body
US20050075167A1 (en) * 2001-08-09 2005-04-07 Igt Game interaction in 3-D gaming environments
CN102903139A (en) * 2012-09-07 2013-01-30 罗健欣 Accelerated rendering method for contours
CN103021013A (en) * 2012-11-28 2013-04-03 无锡羿飞科技有限公司 High-efficiency processing method for spherical display and rotary output image of projector
CN103544677A (en) * 2013-11-01 2014-01-29 中国人民解放军信息工程大学 Space-air-ground integration situational expression engine and shaking elimination method
CN103605883A (en) * 2013-11-01 2014-02-26 中国人民解放军信息工程大学 Space-air-earth-integrated situation expression engine and engine viewpoint cross-scale seamless switching method
US20140085295A1 (en) * 2012-09-21 2014-03-27 Tamaggo Inc. Direct environmental mapping method and system
CN104637089A (en) * 2015-02-15 2015-05-20 腾讯科技(深圳)有限公司 Three-dimensional model data processing method and device
US20150310623A1 (en) * 2014-04-04 2015-10-29 Imagineer Systems Ltd. Processing of digital motion images
CN105701828A (en) * 2016-01-14 2016-06-22 广州视睿电子科技有限公司 Image-processing method and device
CN105913478A (en) * 2015-12-28 2016-08-31 乐视致新电子科技(天津)有限公司 360-degree panorama display method and display module, and mobile terminal
CN106101741A (en) * 2016-07-26 2016-11-09 武汉斗鱼网络科技有限公司 Internet video live broadcasting platform is watched the method and system of panoramic video
CN106210859A (en) * 2016-08-11 2016-12-07 合网络技术(北京)有限公司 Panoramic video rendering intent and device
CN106355634A (en) * 2016-08-30 2017-01-25 北京像素软件科技股份有限公司 Sun simulating method and device
CN106710003A (en) * 2017-01-09 2017-05-24 成都品果科技有限公司 Three-dimensional photographing method and system based on OpenGL ES (Open Graphics Library for Embedded System)
CN106851244A (en) * 2017-01-10 2017-06-13 北京阿吉比科技有限公司 The method and system of 3D panoramic videos are watched based on internet video live broadcasting platform
CN107093207A (en) * 2017-04-12 2017-08-25 武汉大学 A kind of dynamic and visual method of the natural gas leaking diffusion based on GPGPU
CN107146274A (en) * 2017-05-05 2017-09-08 上海兆芯集成电路有限公司 Image data processing system, texture mapping compression and the method for producing panoramic video
US20170287107A1 (en) * 2016-04-05 2017-10-05 Qualcomm Incorporated Dual fisheye image stitching for spherical video
CN107248193A (en) * 2017-05-22 2017-10-13 北京红马传媒文化发展有限公司 The method, system and device that two dimensional surface is switched over virtual reality scenario
CN107564089A (en) * 2017-08-10 2018-01-09 腾讯科技(深圳)有限公司 Three dimensional image processing method, device, storage medium and computer equipment
JP2018026989A (en) * 2016-08-13 2018-02-15 英文 久保田 Method for increasing and decreasing gravity on planet
CN107945274A (en) * 2017-12-26 2018-04-20 苏州蜗牛数字科技股份有限公司 A kind of crater terrain generation method and device based on fertile sharp noise
CN108154553A (en) * 2018-01-04 2018-06-12 中测新图(北京)遥感技术有限责任公司 The seamless integration method and device of a kind of threedimensional model and monitor video

Patent Citations (24)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050075167A1 (en) * 2001-08-09 2005-04-07 Igt Game interaction in 3-D gaming environments
JP2004184236A (en) * 2002-12-03 2004-07-02 Sumitomo Rubber Ind Ltd Measuring method for rotational characteristics and flight characteristics of spherical body
CN102903139A (en) * 2012-09-07 2013-01-30 罗健欣 Accelerated rendering method for contours
US20140085295A1 (en) * 2012-09-21 2014-03-27 Tamaggo Inc. Direct environmental mapping method and system
CN103021013A (en) * 2012-11-28 2013-04-03 无锡羿飞科技有限公司 High-efficiency processing method for spherical display and rotary output image of projector
CN103544677A (en) * 2013-11-01 2014-01-29 中国人民解放军信息工程大学 Space-air-ground integration situational expression engine and shaking elimination method
CN103605883A (en) * 2013-11-01 2014-02-26 中国人民解放军信息工程大学 Space-air-earth-integrated situation expression engine and engine viewpoint cross-scale seamless switching method
US20150310623A1 (en) * 2014-04-04 2015-10-29 Imagineer Systems Ltd. Processing of digital motion images
CN104637089A (en) * 2015-02-15 2015-05-20 腾讯科技(深圳)有限公司 Three-dimensional model data processing method and device
CN105913478A (en) * 2015-12-28 2016-08-31 乐视致新电子科技(天津)有限公司 360-degree panorama display method and display module, and mobile terminal
CN105701828A (en) * 2016-01-14 2016-06-22 广州视睿电子科技有限公司 Image-processing method and device
US20170287107A1 (en) * 2016-04-05 2017-10-05 Qualcomm Incorporated Dual fisheye image stitching for spherical video
CN106101741A (en) * 2016-07-26 2016-11-09 武汉斗鱼网络科技有限公司 Internet video live broadcasting platform is watched the method and system of panoramic video
CN106210859A (en) * 2016-08-11 2016-12-07 合网络技术(北京)有限公司 Panoramic video rendering intent and device
JP2018026989A (en) * 2016-08-13 2018-02-15 英文 久保田 Method for increasing and decreasing gravity on planet
CN106355634A (en) * 2016-08-30 2017-01-25 北京像素软件科技股份有限公司 Sun simulating method and device
CN106710003A (en) * 2017-01-09 2017-05-24 成都品果科技有限公司 Three-dimensional photographing method and system based on OpenGL ES (Open Graphics Library for Embedded System)
CN106851244A (en) * 2017-01-10 2017-06-13 北京阿吉比科技有限公司 The method and system of 3D panoramic videos are watched based on internet video live broadcasting platform
CN107093207A (en) * 2017-04-12 2017-08-25 武汉大学 A kind of dynamic and visual method of the natural gas leaking diffusion based on GPGPU
CN107146274A (en) * 2017-05-05 2017-09-08 上海兆芯集成电路有限公司 Image data processing system, texture mapping compression and the method for producing panoramic video
CN107248193A (en) * 2017-05-22 2017-10-13 北京红马传媒文化发展有限公司 The method, system and device that two dimensional surface is switched over virtual reality scenario
CN107564089A (en) * 2017-08-10 2018-01-09 腾讯科技(深圳)有限公司 Three dimensional image processing method, device, storage medium and computer equipment
CN107945274A (en) * 2017-12-26 2018-04-20 苏州蜗牛数字科技股份有限公司 A kind of crater terrain generation method and device based on fertile sharp noise
CN108154553A (en) * 2018-01-04 2018-06-12 中测新图(北京)遥感技术有限责任公司 The seamless integration method and device of a kind of threedimensional model and monitor video

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
YANG JIANXIONG: "Implementing a 3D orbiting planet effect with ActionScript 3.0", Computer Knowledge and Technology (《电脑知识与技术》) *
WANG YUXIA: "Research and implementation of typical layer visualization of three-dimensional geospatial data", China Master's Theses Full-text Database, Basic Sciences (《中国优秀硕士学位论文全文数据库 基础科学辑》) *

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110570482A (en) * 2019-08-07 2019-12-13 一诺仪器(中国)有限公司 Constellation diagram drawing method and system and spectrum analyzer
CN110570482B (en) * 2019-08-07 2022-12-09 一诺仪器(中国)有限公司 Constellation diagram drawing method and system and spectrum analyzer
CN111311716A (en) * 2020-02-27 2020-06-19 Oppo广东移动通信有限公司 Animation playing method and device, terminal equipment and storage medium
CN111311716B (en) * 2020-02-27 2023-05-12 Oppo广东移动通信有限公司 Animation playing method, device, terminal equipment and storage medium

Also Published As

Publication number Publication date
CN108921778B (en) 2022-12-30

Similar Documents

Publication Publication Date Title
CN104463948B (en) Seamless visualization method for three-dimensional virtual reality system and geographic information system
CN101689306B (en) Efficient 2-d and 3-d graphics processing
Heckbert Fundamentals of texture mapping and image warping
US20060232584A1 (en) Stereoscopic picture generating apparatus
CN106204712B (en) Piecewise linearity irregularly rasterizes
US9595080B2 (en) Implementing and interpolating rotations from a computing input device
CN107392988A (en) System, the method and computer program product for being used to render with variable sampling rate using perspective geometry distortion
Zhao et al. Conformal magnifier: A focus+ context technique with local shape preservation
US10991068B2 (en) Projection image construction method and device
CN108133454A (en) Model space geometric image switching method, device, system and interactive device
CN108921778A (en) A kind of celestial body effect drawing generating method
CN106558017A (en) Spherical display image processing method and system
CN104517313B (en) The method of ambient light masking based on screen space
Penaranda et al. Real-time correction of panoramic images using hyperbolic Möbius transformations
Yan et al. A non-photorealistic rendering method based on Chinese ink and wash painting style for 3D mountain models
JP2020532022A (en) Sphere light field rendering method in all viewing angles
CN109816761A (en) Figure conversion method, device, storage medium and electronic equipment
CN112862981B (en) Method and apparatus for presenting a virtual representation, computer device and storage medium
CN105913473A (en) Realization method and system of scrolling special efficacy
JPS61183781A (en) Distortion expressing system of two-dimensional image
CN114937117A (en) Thermodynamic diagram rendering method, system, medium and equipment
CN110264419B (en) Image style conversion method, system, equipment and medium for realizing oil painting effect
CN109360263A (en) A kind of the Real-time Soft Shadows generation method and device of resourceoriented restricted movement equipment
CN110728619B (en) Panoramic image stitching rendering method and device
CN116883566A (en) Mapping-based model edge tracing method, mapping-based model edge tracing device, computer equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant