CN104134201A - Texture image stitching method and device - Google Patents

Texture image stitching method and device

Info

Publication number: CN104134201A
Application number: CN201410373151.1A
Authority: CN (China)
Prior art keywords: pixel, tile image, texture, shrink ratio
Legal status: Granted
Other languages: Chinese (zh)
Other versions: CN104134201B (en)
Inventors: 汪月娇, 潘桂聪
Current assignee: Vtron Group Co Ltd
Original assignee: Vtron Technologies Ltd
Priority date: 2014-07-31
Filing date: 2014-07-31
Publication date: 2014-11-05
Application filed by Vtron Technologies Ltd; priority to CN201410373151.1A
Publication of CN104134201A; application granted and published as CN104134201B
Current legal status: Expired - Fee Related

Landscapes

  • Image Processing (AREA)
  • Image Generation (AREA)

Abstract

The invention discloses a texture image stitching method and device. The method comprises the following steps: the pixels of an original tile image are expanded outward on all sides to obtain a new tile image; a shrink ratio is determined from the pixel counts of the original tile image and the new tile image, and the preset sampling range of the texture coordinates is shrunk by this ratio; an offset displacement is determined from the number of expanded pixels and the pixel count of the new tile image, and the shrunk texture coordinates are shifted by this displacement; the new tile image is then sampled over the range of the shifted texture coordinates. The method is simple and efficient, achieves seamless stitching of multiple texture images, renders clear textures, and gives a good display effect.

Description

Texture image stitching method and apparatus
Technical field
The present invention relates to the technical field of simulation graphics, and in particular to a texture image stitching method and apparatus.
Background art
In a simulation scene, the most intuitive way to represent the true appearance of the natural world is to use three-dimensional models. However, because the objective world is intricate and the difficulty of modeling grows accordingly, three-dimensional models alone cannot meet practical needs, so image processing techniques are usually combined with computer graphics techniques to solve the problem of realistic display in virtual scenes. For example, texture mapping is used to show the surface details of three-dimensional objects that cannot be embodied by geometry alone, thereby making up for the deficiency of geometric rendering and improving the realism of the displayed image.
When texture mapping is performed, the image appears at the position given by the corresponding texture coordinates on the surface of the three-dimensional object, and texture coordinates are usually limited to the range 0 to 1. If a coordinate falls outside this range, the result is decided by the texture's wrapping (mapping) function.
In a simulation scene, for scene entities rendered with perspective projection, the texture is generally either too large or too small, and in this situation the texture needs to be filtered to match the entity. To simplify the filtering process, a render engine typically provides three types of texture filtering: linear filtering, anisotropic filtering and mipmap (multum in parvo map) filtering. If no filtering mode is selected, the render engine uses the default nearest-point sampling, which uses the texture element closest to the pixel center. If linear filtering is selected, a weighted average is taken over the n*n texture elements around the pixel center.
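For illustration only (this example is not part of the patent), the sketch below shows how these filtering modes are commonly selected in OpenGL through the PyOpenGL bindings; the function name and mode strings are assumptions, a 2D texture is assumed to be already created and bound, and anisotropic filtering is omitted because it requires an extension:

    from OpenGL.GL import (
        glTexParameteri, glGenerateMipmap,
        GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_TEXTURE_MAG_FILTER,
        GL_NEAREST, GL_LINEAR, GL_LINEAR_MIPMAP_LINEAR,
    )

    def set_texture_filtering(mode="linear"):
        """Select a filtering mode for the currently bound 2D texture."""
        if mode == "nearest":    # default closest-point sampling
            glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST)
            glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST)
        elif mode == "linear":   # weighted average of the texels near the pixel center
            glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR)
            glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR)
        elif mode == "mipmap":   # trilinear filtering over a mipmap chain
            glGenerateMipmap(GL_TEXTURE_2D)
            glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR_MIPMAP_LINEAR)
            glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR)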
In actual simulation scenes, when the requirements on texture detail are not high, large areas of texture are generated with a repeated (tiled) texture mapping technique. This repeated texture mapping technique can achieve seamless stitching of the texture, but the method is complicated and its efficiency is low.
Summary of the invention
In view of the above problems, it is necessary to provide a texture image stitching method and apparatus that is simple, efficient, and capable of seamless stitching.
A texture image stitching method comprises the steps of:
expanding the pixels of an original tile image outward on all sides to obtain a new tile image;
determining a shrink ratio from the pixel counts of the original tile image and the new tile image, and shrinking the preset sampling range of the texture coordinates by the shrink ratio;
determining an offset displacement from the number of expanded pixels and the pixel count of the new tile image, and shifting the shrunk texture coordinates by the offset displacement;
sampling the new tile image over the range of the shifted texture coordinates.
A texture image stitching apparatus comprises:
a new tile image generation module, configured to expand the pixels of an original tile image outward on all sides to obtain a new tile image;
a sampling range setting module, configured to determine a shrink ratio from the pixel counts of the original tile image and the new tile image, and to shrink the preset sampling range of the texture coordinates by the shrink ratio;
a texture coordinate moving module, configured to determine an offset displacement from the number of expanded pixels and the pixel count of the new tile image, and to shift the shrunk texture coordinates by the offset displacement;
a sampling module, configured to sample the new tile image over the range of the shifted texture coordinates.
Compared with the prior art, the texture image stitching method and apparatus of the present invention have the following advantages:
1. By expanding the pixels of the original tile image and shrinking the sampling range of the texture coordinates before sampling and stitching the tile images, the present invention avoids the loss of texture elements at the border of a texture image during sampling, thereby achieving seamless stitching of multiple texture images; the rendered texture is clear and the display effect is good.
2. The method of the present invention is simple, an entity can bind multiple textures and still be stitched seamlessly, and the rendering efficiency is high.
3. The present invention is applicable not only to the seamless tiled display of planar maps but also to the seamless stitching of the many textures of large terrains in virtual scenes, so its scope of application is wide.
Brief description of the drawings
Fig. 1 is a schematic flowchart of an embodiment of the method of the present invention;
Fig. 2 is a schematic diagram of a first embodiment of the new tile image of the present invention;
Fig. 3 is a schematic diagram of a second embodiment of the new tile image of the present invention;
Fig. 4 is a schematic diagram of the division of new tile images in a specific embodiment of the present invention;
Fig. 5 is a schematic diagram of the texture coordinate range of the specific embodiment of the present invention after shrinking;
Fig. 6 is a schematic diagram of the texture coordinates of the specific embodiment of the present invention after shifting;
Fig. 7 is a schematic structural diagram of an embodiment of the apparatus of the present invention.
Detailed description of the embodiments
The embodiments of the texture image stitching method of the present invention are described in detail below with reference to the accompanying drawings.
As shown in Fig. 1, a texture image stitching method comprises the steps of:
S110: expanding the pixels of an original tile image outward on all sides to obtain a new tile image;
Texture seams arise because, when the edge of a local texture image is sampled, some texture elements are missing. To solve this seam problem, the texture is re-partitioned into tile images and the pixels of each original tile image are expanded outward, so that adjacent parts of the expanded tile images overlap.
The minimum number of expanded pixels can be determined from the sampling factor of the graphics card; for example, if the sampling factor of the graphics card is 2*2, the minimum number of expanded pixels is 1. The numbers of pixels expanded in the different directions may be identical or different: for example, as shown in Fig. 2, the original tile image may be expanded by n pixels on every side, or, as shown in Fig. 3, by n pixels in the horizontal direction and m pixels in the vertical direction. To improve the operating efficiency of the graphics card, the minimum number of expanded pixels is generally used.
Suppose the original tile image has size size*size. Since the new tile image adds n pixels on each edge compared with the original tile image, the size of the new tile image is (size+2*n)*(size+2*n). An image is stored by its upper-left corner coordinate, width and height. As shown in Fig. 2, if the upper-left corner coordinate of the first new tile image in the horizontal direction is [0, 0], then the upper-left corner coordinate of the second horizontal new tile image is [size, 0], the upper-left corner coordinate of the third is [2*size, 0], and by analogy the j-th horizontal new tile image has upper-left corner coordinate [j*size, 0]; likewise, the upper-left corner coordinate of the second new tile image in the vertical direction is [0, size], and by analogy the i-th vertical new tile image has upper-left corner coordinate [0, i*size].
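For illustration (not part of the patent text), a minimal Python sketch of this tile layout follows; the function name and the (left, top, width, height) return convention are assumptions, and the row and column indices i, j are taken to start at 0 so that the first tile's origin is [0, 0] as described above:

    def new_tile_rect(i, j, size, n):
        """Upper-left corner, width and height of the new tile image at row i, column j.

        The first horizontal tile starts at [0, 0], the second at [size, 0], and
        so on; every new tile is (size + 2*n) pixels on a side, so neighbouring
        new tiles overlap by 2*n pixels.
        """
        left, top = j * size, i * size
        side = size + 2 * n
        return left, top, side, side

    # Example: with size = 256 and n = 1, the tile at row 0, column 1 is stored
    # as (256, 0, 258, 258).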
S120: determining a shrink ratio from the pixel counts of the original tile image and the new tile image, and shrinking the preset sampling range of the texture coordinates by the shrink ratio;
Texture coordinates define which part of the texture image is mapped onto the entity. The texture coordinates used for sampling a tile image are set to the range [0, 1], and this range is then shrunk according to the determined shrink ratio.
S130: determining an offset displacement from the number of expanded pixels and the pixel count of the new tile image, and shifting the shrunk texture coordinates by the offset displacement;
So that the texture pixels sampled from the new tile image are displayed exactly as the original tile image would be, while the seam problem between different textures is solved at the same time, the shrunk texture coordinate range must be shifted by the determined offset displacement.
S140: sampling the new tile image over the range of the shifted texture coordinates. The sampled texture pixels are displayed exactly as the original tile image would be, the loss of texture elements at the border of the texture image during sampling is avoided, and seamless stitching of multiple textures is achieved.
The shrink ratio in step S120 may comprise a horizontal shrink ratio and a vertical shrink ratio:
the horizontal shrink ratio is the ratio of the horizontal pixel count of the original tile image to the horizontal pixel count of the new tile image;
the vertical shrink ratio is the ratio of the vertical pixel count of the original tile image to the vertical pixel count of the new tile image.
If the original tile image is expanded outward by n pixels on every side and the texture coordinates are set to the range [0, 1], then, as shown in Fig. 2, the horizontal shrink ratio is x/(x+2n) and the vertical shrink ratio is y/(y+2n), where x is the horizontal pixel count of the original tile image and y is its vertical pixel count. The texture coordinate range therefore becomes [0, x/(x+2n)] in the horizontal direction and [0, y/(y+2n)] in the vertical direction.
If the original tile image is expanded by n pixels horizontally and m pixels vertically and the texture coordinates are set to the range [0, 1], then, as shown in Fig. 3, the horizontal shrink ratio is x/(x+2n) and the vertical shrink ratio is y/(y+2m). The texture coordinate range therefore becomes [0, x/(x+2n)] in the horizontal direction and [0, y/(y+2m)] in the vertical direction.
If the original tile image is expanded by different numbers of pixels on the left and right, and/or by different numbers of pixels at the top and bottom, the horizontal and vertical shrink ratios are still determined according to the above definitions, which are not repeated here.
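As a minimal illustration of the shrink ratios defined above (Python, not part of the patent; the function name and the per-side padding parameters are assumptions introduced so that the asymmetric case is covered as well), the ratio in each direction is simply the original pixel count divided by the new pixel count:

    def shrink_ratios(x, y, left, right, top, bottom):
        """Horizontal and vertical shrink ratios for an original x*y tile image
        expanded by the given number of pixels on each side."""
        sx = x / (x + left + right)   # reduces to x / (x + 2*n) for symmetric padding n
        sy = y / (y + top + bottom)   # reduces to y / (y + 2*m) for symmetric padding m
        return sx, sy

    # The preset sampling range [0, 1] then shrinks to [0, sx] horizontally and
    # [0, sy] vertically; e.g. shrink_ratios(256, 256, 1, 1, 1, 1) == (256/258, 256/258).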
The shrunk texture coordinate range then needs to be shifted so that seamless stitching and a consistent display effect are achieved. The offset displacement in step S130 may comprise an offset displacement in the U direction and an offset displacement in the V direction:
the offset displacement in the U direction is the ratio of the number of horizontally expanded pixels to the horizontal pixel count of the new tile image;
the offset displacement in the V direction is the ratio of the number of vertically expanded pixels to the vertical pixel count of the new tile image.
If the original tile image is expanded outward by n pixels on every side and the texture coordinates are set to the range [0, 1], the texture coordinate range after shrinking becomes [0, x/(x+2n)] in the horizontal direction and [0, y/(y+2n)] in the vertical direction. The offset displacement of the texture coordinates is then n/(x+2n) in the horizontal direction and n/(y+2n) in the vertical direction; that is, the U value of the texture coordinates is offset by n/(x+2n) to the left and the V value is offset upward by n/(y+2n). The texture pixels of the tile image sampled with the shifted texture coordinates are displayed exactly as the original tile image would be, and the seam problem between different textures is solved.
If the original tile image is expanded by n pixels horizontally and m pixels vertically and the texture coordinates are set to the range [0, 1], the texture coordinate range after shrinking becomes [0, x/(x+2n)] in the horizontal direction and [0, y/(y+2m)] in the vertical direction. The offset displacement of the texture coordinates is then n/(x+2n) in the horizontal direction and m/(y+2m) in the vertical direction; that is, the U value of the texture coordinates is offset by n/(x+2n) to the left and the V value is offset upward by m/(y+2m). The texture pixels of the tile image sampled with the shifted texture coordinates are displayed exactly as the original tile image would be, and the seam problem between different textures is solved.
If the original tile image is expanded by different numbers of pixels on the left and right, and/or by different numbers of pixels at the top and bottom, the offset displacements of the texture coordinates in the U and V directions are likewise determined according to the above definitions, except that the number of horizontally expanded pixels is the number of pixels by which the original tile image is expanded to the left (that is, the number of pixels by which the origin of the original tile image is expanded outward horizontally), and the number of vertically expanded pixels is the number of pixels by which the original tile image is expanded downward (that is, the number of pixels by which the origin of the original tile image is expanded outward vertically).
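Correspondingly, a minimal Python sketch of the offset displacements (again not part of the patent; the function name and the per-side parameters are assumptions, and the use of the left and bottom padding follows the origin-side convention just described):

    def offset_displacements(x, y, left, right, top, bottom):
        """Offset displacements in the U and V directions for an original x*y tile
        image expanded by the given number of pixels on each side.

        Only the padding on the origin side of the tile enters the numerator; the
        total padding enters through the new tile's pixel count in the denominator.
        """
        du = left / (x + left + right)     # n / (x + 2*n) in the symmetric case
        dv = bottom / (y + top + bottom)   # m / (y + 2*m) in the symmetric case
        return du, dv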
For a better understanding of the method of the present invention, an embodiment of the method is described in detail below with reference to a specific example.
As shown in Fig. 4, the size of the original tile image is size*size; after it is expanded outward by n pixels on every side, the size of the new tile image is (size+2*n)*(size+2*n).
The texture coordinates used for sampling the tile image are all set to the range [0, 1]; this range is then multiplied as a whole by size/(size+2*n) to shrink the texture coordinate range, and the shrunk texture coordinate range is shown in Fig. 5.
The U value of the shrunk texture coordinates is offset by n/(size+2*n) to the left and the V value is offset upward by n/(size+2*n); as shown in Fig. 6, the final sampling texture coordinate range runs from [n/(size+2*n), n/(size+2*n)] to [(size+n)/(size+2*n), (size+n)/(size+2*n)].
The texture pixels sampled with the texture coordinates shifted as above are displayed exactly as the original tile image would be, and the different textures join seamlessly.
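Combining the shrink and the offset of this specific example, a minimal Python sketch (not part of the patent; the function name and the size = 256, n = 1 figures are illustrative assumptions) reproduces the final sampling range quoted above:

    def final_uv_range(size, n):
        """[min, max] of the final sampling texture coordinates, in both U and V,
        for a size*size tile image expanded by n pixels on every side."""
        scale = size / (size + 2 * n)   # shrink the preset range [0, 1]
        shift = n / (size + 2 * n)      # then offset the shrunk range
        return shift, scale + shift     # = [n/(size+2n), (size+n)/(size+2n)]

    lo, hi = final_uv_range(256, 1)     # (1/258, 257/258), roughly (0.0039, 0.9961)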
Based on the same inventive concept, the present invention also provides a texture image stitching apparatus. Embodiments of the apparatus of the present invention are described in detail below with reference to the accompanying drawings.
As shown in Fig. 7, a texture image stitching apparatus comprises:
a new tile image generation module 100, configured to expand the pixels of an original tile image outward on all sides to obtain a new tile image;
The minimum number of expanded pixels can be determined from the sampling factor of the graphics card; for example, if the sampling factor of the graphics card is 2*2, the minimum number of expanded pixels is 1. To improve the operating efficiency of the graphics card, the minimum number of expanded pixels is generally used.
a sampling range setting module 200, configured to determine a shrink ratio from the pixel counts of the original tile image and the new tile image, and to shrink the preset sampling range of the texture coordinates by the shrink ratio;
Texture coordinates define which part of the texture image is mapped onto the entity. The texture coordinates used for sampling a tile image are set to the range [0, 1], and this range is then shrunk according to the determined shrink ratio.
a texture coordinate moving module 300, configured to determine an offset displacement from the number of expanded pixels and the pixel count of the new tile image, and to shift the shrunk texture coordinates by the offset displacement;
So that the texture pixels sampled from the new tile image are displayed exactly as the original tile image would be, while the seam problem between different textures is solved at the same time, the shrunk texture coordinate range must be shifted by the determined offset displacement.
a sampling module 400, configured to sample the new tile image over the range of the shifted texture coordinates. The sampled texture pixels are displayed exactly as the original tile image would be, the loss of texture elements at the border of the texture image during sampling is avoided, and seamless stitching of multiple textures is achieved.
The shrink ratio may comprise a horizontal shrink ratio and a vertical shrink ratio:
the horizontal shrink ratio is the ratio of the horizontal pixel count of the original tile image to the horizontal pixel count of the new tile image;
the vertical shrink ratio is the ratio of the vertical pixel count of the original tile image to the vertical pixel count of the new tile image.
The shrunk texture coordinate range then needs to be shifted so that seamless stitching and a consistent display effect are achieved. The offset displacement may comprise an offset displacement in the U direction and an offset displacement in the V direction:
the offset displacement in the U direction is the ratio of the number of horizontally expanded pixels to the horizontal pixel count of the new tile image;
the offset displacement in the V direction is the ratio of the number of vertically expanded pixels to the vertical pixel count of the new tile image.
The other technical features of the apparatus of the present invention are the same as those of the above method and are not repeated here.
The above embodiments express only several implementations of the present invention, and although their description is relatively specific and detailed, it should not therefore be construed as limiting the scope of the patent of the present invention. It should be pointed out that a person of ordinary skill in the art can make several variations and improvements without departing from the concept of the present invention, and these all fall within the protection scope of the present invention. Therefore, the protection scope of this patent shall be subject to the appended claims.

Claims (10)

1. A texture image stitching method, characterized in that it comprises the steps of:
expanding the pixels of an original tile image outward on all sides to obtain a new tile image;
determining a shrink ratio from the pixel counts of the original tile image and the new tile image, and shrinking the preset sampling range of the texture coordinates by the shrink ratio;
determining an offset displacement from the number of expanded pixels and the pixel count of the new tile image, and shifting the shrunk texture coordinates by the offset displacement;
sampling the new tile image over the range of the shifted texture coordinates.
2. The texture image stitching method according to claim 1, characterized in that the minimum number of expanded pixels is determined from the sampling factor of the graphics card.
3. The texture image stitching method according to claim 1, characterized in that the shrink ratio comprises a horizontal shrink ratio and a vertical shrink ratio;
the horizontal shrink ratio is the ratio of the horizontal pixel count of the original tile image to the horizontal pixel count of the new tile image;
the vertical shrink ratio is the ratio of the vertical pixel count of the original tile image to the vertical pixel count of the new tile image.
4. The texture image stitching method according to claim 1, characterized in that the offset displacement comprises an offset displacement in the U direction and an offset displacement in the V direction;
the offset displacement in the U direction is the ratio of the number of horizontally expanded pixels to the horizontal pixel count of the new tile image;
the offset displacement in the V direction is the ratio of the number of vertically expanded pixels to the vertical pixel count of the new tile image.
5. The texture image stitching method according to any one of claims 1 to 4, characterized in that the preset sampling range is [0, 1].
6. A texture image stitching apparatus, characterized in that it comprises:
a new tile image generation module, configured to expand the pixels of an original tile image outward on all sides to obtain a new tile image;
a sampling range setting module, configured to determine a shrink ratio from the pixel counts of the original tile image and the new tile image, and to shrink the preset sampling range of the texture coordinates by the shrink ratio;
a texture coordinate moving module, configured to determine an offset displacement from the number of expanded pixels and the pixel count of the new tile image, and to shift the shrunk texture coordinates by the offset displacement;
a sampling module, configured to sample the new tile image over the range of the shifted texture coordinates.
7. The texture image stitching apparatus according to claim 6, characterized in that the minimum number of expanded pixels is determined from the sampling factor of the graphics card.
8. The texture image stitching apparatus according to claim 6, characterized in that the shrink ratio comprises a horizontal shrink ratio and a vertical shrink ratio;
the horizontal shrink ratio is the ratio of the horizontal pixel count of the original tile image to the horizontal pixel count of the new tile image;
the vertical shrink ratio is the ratio of the vertical pixel count of the original tile image to the vertical pixel count of the new tile image.
9. The texture image stitching apparatus according to claim 6, characterized in that the offset displacement comprises an offset displacement in the U direction and an offset displacement in the V direction;
the offset displacement in the U direction is the ratio of the number of horizontally expanded pixels to the horizontal pixel count of the new tile image;
the offset displacement in the V direction is the ratio of the number of vertically expanded pixels to the vertical pixel count of the new tile image.
10. The texture image stitching apparatus according to any one of claims 6 to 9, characterized in that the preset sampling range is [0, 1].
CN201410373151.1A 2014-07-31 2014-07-31 Texture image stitching method and apparatus Expired - Fee Related CN104134201B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201410373151.1A CN104134201B (en) 2014-07-31 2014-07-31 Texture image stitching method and apparatus


Publications (2)

Publication Number Publication Date
CN104134201A true CN104134201A (en) 2014-11-05
CN104134201B CN104134201B (en) 2017-03-29

Family

ID=51806870

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201410373151.1A Expired - Fee Related CN104134201B (en) 2014-07-31 2014-07-31 Texture image stitching method and apparatus

Country Status (1)

Country Link
CN (1) CN104134201B (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107526504A (en) * 2017-08-10 2017-12-29 广州酷狗计算机科技有限公司 Method and device, terminal and the storage medium that image is shown
CN111870955A (en) * 2020-07-24 2020-11-03 上海米哈游天命科技有限公司 Height map generation method, device, equipment and storage medium

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100189359A1 (en) * 2007-06-15 2010-07-29 Shinichiro Gomi Image processing apparatus, image processing method, program of image processing method, and recording medium having program of image processing method recorded thereon
CN102663801A (en) * 2012-04-19 2012-09-12 北京天下图数据技术有限公司 Method for improving three-dimensional model rendering performance
CN102737097A (en) * 2012-03-30 2012-10-17 北京峰盛博远科技有限公司 Three-dimensional vector real-time dynamic stacking technique based on LOD (Level of Detail) transparent textures
CN103473750A (en) * 2013-08-02 2013-12-25 毕胜 Rendering-fabric boundary fusion splicing method
CN103593862A (en) * 2013-11-21 2014-02-19 广东威创视讯科技股份有限公司 Image display method and control unit



Also Published As

Publication number Publication date
CN104134201B (en) 2017-03-29


Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
CP03 Change of name, title or address

Address after: No. 233 Kezhu Road, Guangzhou High-tech Industrial Development Zone, Guangzhou, Guangdong Province, 510670

Patentee after: VTRON GROUP Co.,Ltd.

Address before: No. 233 Kezhu Road, Guangzhou High-tech Industrial Development Zone, Guangzhou, Guangdong Province, 510670

Patentee before: VTRON TECHNOLOGIES Ltd.

CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 2017-03-29

Termination date: 2021-07-31