CN101038675A - Method and apparatus for implementing wash painting style - Google Patents
- Publication number: CN101038675A
- Authority
- CN
- China
- Prior art keywords
- texture image
- pixel
- sampling
- pixels
- contour
- Prior art date
- Legal status: Granted (the status is an assumption, not a legal conclusion; Google has not performed a legal analysis and makes no representation as to its accuracy)
Landscapes
- Processing Or Creating Images (AREA)
- Image Generation (AREA)
Abstract
The invention discloses a method for ink-wash style rendering that operates in 2D texture space. The method comprises: rendering a model of an object to a first floating-point texture image; sampling each pixel of the first floating-point texture image and determining which pixels lie on the object's contour, so as to form a second texture image in which the object's contour is highlighted; and sampling each pixel of the second texture image, sampling the pixel at the corresponding position of a background texture image according to each pixel's position, mixing it with the corresponding pixel sampled from the second texture image, and outputting the mixed pixels to the background texture image so as to present the object's model in ink-wash style. The invention also discloses an apparatus for ink-wash style rendering.
Description
Technical field
The present invention relates to the field of computer technology, and in particular to a method and apparatus for rendering in the ink-wash painting style.
Background technology
Computer rendering of object models can be divided into photorealistic rendering (PR) and non-photorealistic rendering (NPR). PR studies actual physical shading and lighting to reproduce real-world rendering effects; NPR studies the abstract artistic conception that an artist sketches with a brush. NPR has so far developed and matured mostly around Western painting styles, such as sketch, pen, charcoal, watercolor and cartoon. Existing examples include ATI's watercolor-rendering demo, and cartoon-rendered PS2 games such as Dragon Quest 8, the Teenage Mutant Ninja Turtles series and Dragon Ball.
At present, the hooked outlines and strokes of the ink-wash style are realized mainly in physical space by the computer's central processing unit (CPU) to achieve an ink-wash rendering effect. The main flow is as follows:
1. Find the silhouette edges of the object model (handled by the CPU in 3D geometric space);
2. Perform visibility culling on the silhouette-edge vertices (lock the Z-buffer and reject silhouette-edge vertices one by one by comparison);
3. Connect the silhouette-edge vertices to form strokes (the CPU compares the positions of visible silhouette-edge vertices one by one and joins them into strokes);
4. Wrap an artist's brushwork around each stroke (a texture with an alpha channel is applied as a decal to triangles to produce the brushwork).
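The silhouette-edge test at the heart of step 1 can be sketched as follows — a minimal illustration, not the prior art's actual code; the vectors and function names are hypothetical. An edge of the mesh is a silhouette edge when one of its two adjacent faces points toward the viewer and the other points away:

```python
def dot(a, b):
    # Dot product of two 3-component vectors
    return sum(x * y for x, y in zip(a, b))

def is_silhouette_edge(normal_a, normal_b, view_dir):
    # An edge is a silhouette edge when exactly one of the two adjacent
    # faces points toward the viewer (positive N.V) and the other away.
    return (dot(normal_a, view_dir) > 0) != (dot(normal_b, view_dir) > 0)

# Front-facing / back-facing face pair: silhouette edge.
print(is_silhouette_edge((0, 0, 1), (0, 0, -1), (0, 0, 1)))   # → True
# Two front-facing faces: interior edge, no stroke.
print(is_silhouette_edge((0, 0, 1), (0.1, 0, 1), (0, 0, 1)))  # → False
```

A real implementation runs this test over every edge of the mesh each frame, which is exactly the per-vertex CPU cost the invention's 2D-texture-space approach avoids.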
Although the above method can produce an ink-wash rendering effect, it has the following shortcomings:
1. Since a series of key steps run on the CPU, the demands on the CPU are very high; on computers with weak CPUs it is not only slow but also produces poor results.
2. The processing complexity is tied to the object's model: with few model vertices it can run at a high frame rate (fps, frames per second), but with many vertices the processing becomes very difficult.
3. It is suitable only for ink-wash rendering of a few simple main models in a scene, not for ink-wash rendering of the whole scene.
Summary of the invention
The invention provides a method for realizing the ink-wash painting style, to solve the prior-art problems that ink-wash rendering of object models demands high hardware performance and can only be applied to simple models.
The invention provides the following technical solution:
A method for realizing the ink-wash painting style, processed in 2D texture space, comprising the steps of:
A. rendering the model of an object to a first floating-point texture image;
B. sampling each pixel of the first floating-point texture image and determining which pixels lie on the object's contour, to form a second texture image in which the object's contour is highlighted;
C. sampling each pixel of the second texture image, sampling the pixel at the corresponding position of a background texture image according to each pixel's position, mixing it with the corresponding pixel sampled from the second texture image, and outputting the mixed pixels to the background texture image to present the object's model in ink-wash style.
Wherein:
In step A, a light-source position is further selected and the model is lit.
In step B, while sampling the current pixel, the surrounding pixels are sampled at the same time, and whether the current pixel lies on the object's contour is determined from how sharply the depth values of the surrounding pixels vary.
In step B, before sampling the current pixel, the model's depth values are further filtered.
Before step C, the method further comprises: B1. sampling each pixel of the second texture image and, besides the current pixel, taking multiple samples of the surrounding pixels centered on the current pixel's position, to form an interim texture image; and, in step C, also sampling the pixel at the corresponding position of the interim texture image according to each pixel's position and mixing it with the corresponding pixels sampled from the second texture image and the background texture image.
In step B1, according to each pixel's position in the second texture image, the pixel at the corresponding position of a noise texture image is further sampled and mixed with the corresponding pixel sampled from the second texture image, so that the highlighted object contour in the interim texture image has a rough-stroke effect.
In step C, the minimum (darkest) color value among the currently sampled pixels of the second texture image and the background texture image is taken as the mixed pixel's color value.
An apparatus for realizing the ink-wash painting style, comprising:
a unit for rendering the model of an object to a first floating-point texture image;
a unit for sampling each pixel of the first floating-point texture image and determining the pixels of the object's contour, to form a second texture image in which the object's contour is highlighted;
a unit for sampling each pixel of the second texture image, sampling the pixel at the corresponding position of a background texture image according to each pixel's position, mixing it with the corresponding pixel sampled from the second texture image, and outputting the mixed pixels to the background texture image to present the object's model in ink-wash style.
An apparatus for realizing the ink-wash painting style, comprising:
a unit for rendering the model of an object to a first floating-point texture image;
a unit for sampling each pixel of the first floating-point texture image and determining the pixels of the object's contour, to form a second texture image in which the object's contour is highlighted;
a unit for sampling each pixel of the second texture image, sampling the pixel at the corresponding position of a noise texture image according to each pixel's position, and mixing it with the corresponding pixel sampled from the second texture image, to form a texture image in which the highlighted object contour has a rough-stroke effect;
a unit for sampling each pixel of that texture image, sampling the pixel at the corresponding position of a background texture image according to each pixel's position, mixing it with the corresponding pixel sampled from that texture image, and outputting the mixed pixels to the background texture image to present the object's model in ink-wash style.
Because the present invention processes pixel by pixel in 2D texture space, the time complexity is unrelated to the number of model vertices; it depends only on the size of each output interim texture. The processing can therefore be completed quickly even on computers with weaker CPUs, and is also suitable for complex models.
Description of drawings
Fig. 1 is a flowchart of ink-wash rendering in embodiment one of the invention;
Fig. 2 is a schematic diagram of the apparatus for ink-wash rendering in embodiment one;
Fig. 3A is a schematic diagram of a model rendered to a floating-point texture image in embodiment one;
Fig. 3B is a schematic diagram of lighting applied to the model of Fig. 3A in embodiment one;
Fig. 3C is a schematic diagram of the texture image with the highlighted object contour in embodiment one;
Fig. 3D is a schematic diagram of the background texture image in embodiment one;
Fig. 3E shows the final result after ink-wash rendering in embodiment one;
Fig. 4 is a flowchart of ink-wash rendering in embodiment two of the invention;
Fig. 5 is a schematic diagram of the apparatus for ink-wash rendering in embodiment two;
Fig. 6A is a schematic diagram of pixel sampling in embodiment two;
Fig. 6B is a schematic diagram of the texture image in which the contour strokes have a rough effect in embodiment two;
Fig. 6C shows the final result after ink-wash rendering in embodiment two.
Embodiment
To reduce the hardware requirements and the time complexity of processing when realizing ink-wash rendering, the present invention performs ink-wash-style 3D rendering of object models based on the 2D texture space.
Embodiment one
Referring to Fig. 1, the main flow of ink-wash-style 3D rendering of an object model in this embodiment is as follows:
Step 100: render the object's model to a 64-bit (though not limited to 64-bit) floating-point texture image (A16B16G16R16F denotes a 64-bit texture with four channels A, B, G and R, each channel storing a 16-bit floating-point value).
In this step, the object's depth value is output to the A, G and B channels of the floating-point texture, and the object's brightness value to the R channel.
Referring to Fig. 2, in addition to the units providing basic functions, such as the central processing unit 10 and storage unit 11, the computer realizing ink-wash-style 3D rendering in this embodiment further comprises a rendering unit 20, a first processing unit 21 and a second processing unit 22.
The rendering unit 20 renders the object's model to the floating-point texture image. The first processing unit 21 samples each pixel of the floating-point texture image and determines the pixels of the object's contour, to form a texture image with the highlighted contour. The second processing unit 22 samples each pixel of the texture image with the highlighted contour, samples the pixel at the corresponding position of a selected background texture image according to each pixel's position, mixes the corresponding pixels sampled from the two texture images, and outputs the result to the background texture image to present the object's model in ink-wash style.
The rendering is described in detail below using Microsoft's shading language Shader Model 2.0 as an example. Shader Model 2.0 comprises Vertex Shader 2.0 and Pixel Shader 2.0; computers with an Nvidia FX5200 or better, or an ATI Radeon 9500 or better, support Shader Model 2.0.
One: the rendering unit 20 renders the object model (Model) to a 64-bit floating-point texture image Image (A16B16G16R16F), as shown in Fig. 3A. During rendering, the following processing is further performed in the PixelShader:
A. Output the object's depth value to the A, G and B channels of the floating-point texture. The brighter the object appears here, the larger the depth value it represents.
B. Light the model using the positions of the two points in Fig. 3B as the light-source positions:
color0 = light0 * normal;   // color0: color after lighting by the first light source (light0)
color1 = light1 * normal;   // color1: color after lighting by the second light source (light1)
colorFinal = max(color0, color1);   // take the maximum of the two colors
Output colorFinal to the R channel of the floating-point texture.
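In plain arithmetic, the two-light rule above behaves as sketched below — a Python illustration of the max-of-two-lights step, assuming a clamped Lambert N·L term; the function names are mine, not the patent's:

```python
def lambert(light_dir, normal):
    # Clamped Lambert term: N.L, floored at zero
    d = sum(l * n for l, n in zip(light_dir, normal))
    return max(d, 0.0)

def shade(normal, light0, light1):
    # colorFinal = max(color0, color1): keep the brighter contribution
    return max(lambert(light0, normal), lambert(light1, normal))

n = (0.0, 0.0, 1.0)
# One light head-on, one at a grazing angle: the head-on light wins.
print(shade(n, (0.0, 0.0, 1.0), (1.0, 0.0, 0.0)))  # → 1.0
```

In the shader this runs per pixel; the sketch only checks the arithmetic of the max rule.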
Two: the first processing unit 21 draws a rectangle the same size as the screen, samples the floating-point texture Image, and processes it in the PixelShader:
A. Filter the A channel of floating-point texture image Image with horizontal and vertical filters. Since the A, G and B channels hold identical values, only the A channel needs to be filtered, though the channels could of course be filtered separately.
B. Sample and process texture image Image pixel by pixel, also sampling the 8 pixels surrounding each pixel.
SobelX          SobelY          Sobel (sample positions)
 1  0 -1         1  2  1        s00 s01 s02
 2  0 -2         0  0  0        s10 s11 s12
 1  0 -1        -1 -2 -1        s20 s21 s22
float sobelX = s00 + 2*s10 + s20 - s02 - 2*s12 - s22;
float sobelY = s00 + 2*s01 + s02 - s20 - 2*s21 - s22;
edgeSqr = sobelX*sobelX + sobelY*sobelY;
Here SobelX gives the weight applied to each sampled pixel when filtering in the X direction, SobelY gives the weights for filtering in the Y direction, and Sobel shows the positions of the pixels covered by the filter (a 3×3 grid); s11 is the current pixel, and edgeSqr measures how sharply the pixel values vary around the current pixel.
If edgeSqr is greater than a set threshold n (n is generally 0.005 and can be fine-tuned), the values of the pixels around the current pixel vary sharply; since a pixel's value here encodes its depth, the depth around the current pixel also varies sharply, so the current pixel can be determined to lie on the object's contour. If edgeSqr is less than the threshold n, the current pixel is not on the contour. For pixels on the contour, the pixel's R-channel value (brightness) is output to texture image Edge; for pixels not on the contour, 0 is output to texture image Edge.
After the sampling of floating-point texture image Image is finished, the resulting texture image Edge is as shown in Fig. 3C.
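The contour test can be checked numerically. The sketch below applies the same Sobel weights to a 3×3 depth neighbourhood and compares edgeSqr against the threshold n — illustrative values on hand-made neighbourhoods, not shader code:

```python
# Sobel weights as in the description: sobelX differences columns, sobelY rows.
SOBEL_X = [[1, 0, -1], [2, 0, -2], [1, 0, -1]]
SOBEL_Y = [[1, 2, 1], [0, 0, 0], [-1, -2, -1]]

def edge_sqr(depth3x3):
    # edgeSqr = sobelX^2 + sobelY^2 over a 3x3 depth neighbourhood
    gx = sum(SOBEL_X[r][c] * depth3x3[r][c] for r in range(3) for c in range(3))
    gy = sum(SOBEL_Y[r][c] * depth3x3[r][c] for r in range(3) for c in range(3))
    return gx * gx + gy * gy

n = 0.005
flat = [[0.5] * 3 for _ in range(3)]   # uniform depth: interior of the object
step = [[0.2, 0.2, 0.9]] * 3           # sharp depth jump: contour pixel
print(edge_sqr(flat) > n)  # → False
print(edge_sqr(step) > n)  # → True
```

This matches the patent's rule: a flat depth neighbourhood is not a contour, while a depth discontinuity marks the current pixel as lying on the object's outline.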
Three: the second processing unit 22 draws a rectangle the same size as the screen, samples texture image Edge and the selected background texture image paper, and processes them in the PixelShader:
A. Process pixel by pixel, sampling texture image Edge multiple times per pixel, e.g. taking 28 samples.
A sample code segment follows:
for (int i = 0; i < 28; i++)
{
    sum += tex2D(Edge, texCoord + dis * samples[i]);
}   // take 28 samples of texture image Edge and add up the color values
Here samples[n] stores texture-coordinate offsets, and dis is a correction factor for those offsets: increasing dis slightly thickens the contour lines but lowers their density, compensating for the thinner, denser contour lines obtained in the previous step. dis * samples[i] is the offset from the current pixel's position.
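The gather loop amounts to summing texture taps at scaled offsets around the pixel. A sketch of that accumulation — the offset table and the constant test texture are made up for illustration:

```python
def sample_sum(texture, u0, v0, offsets, dis):
    # Sum the texture sampled at the base coordinate plus each scaled offset,
    # mirroring: sum += tex2D(Edge, texCoord + dis * samples[i])
    return sum(texture(u0 + dis * du, v0 + dis * dv) for du, dv in offsets)

# 28 illustrative offsets laid out on a 7x4 grid around the pixel.
offsets = [(i % 7 - 3, i // 7 - 2) for i in range(28)]
# Against a constant texture the result is simply 28 taps of 0.5.
print(sample_sum(lambda u, v: 0.5, 0.5, 0.5, offsets, 0.01))  # → 14.0
```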
B. According to the pixel position in texture image Edge, sample the pixel at the corresponding position of the background texture image paper shown in Fig. 3D, and mix it with sum:
colorFinal = min((1 - hot*sum), paper);
The minimum (darkest) R-channel value among all sampled texture colors is output as the color.
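The min-blend can be sketched per channel — hot is the brightness correction mentioned later in the description, and the numeric values are illustrative:

```python
def blend(edge_sum, paper, hot=1.0):
    # colorFinal = min(1 - hot*sum, paper): whichever is darker wins, so
    # strokes darken the paper but the paper never brightens a stroke.
    return min(1.0 - hot * edge_sum, paper)

print(blend(0.0, 0.9))   # → 0.9   no edge: the paper texture shows through
print(blend(0.75, 0.9))  # → 0.25  strong edge: the dark stroke wins
```

Taking the minimum rather than averaging keeps the paper grain visible in empty regions while the ink strokes stay dark, which is what gives the composite its ink-on-paper look.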
The final result is shown in Fig. 3E. It was produced at 512 × 512 on a computer with a 1.5 GHz Celeron, a Radeon 9700 SE and 512 MB of RAM, at a frame rate of about 100 fps.
Because the above method processes in 2D texture space using Shader Model 2.0, where the PixelShader operates per pixel, the time complexity of the processing is unrelated to the number of model vertices and depends only on the size of each output interim texture.
Embodiment two
This embodiment adds processing for a rough-stroke effect to the ink-wash rendering flow of embodiment one.
Referring to Fig. 4, the main flow of ink-wash-style 3D rendering of an object model in this embodiment is as follows:
In the rendering step, the object's depth value is output to the A, G and B channels of the floating-point texture, and the object's brightness value to the R channel.
When the pixels of the texture image obtained in step 410 are sampled, besides the current pixel, multiple samples of the surrounding pixels are taken, centered on the current pixel's position.
Referring to Fig. 5, in addition to the units providing basic functions, such as the central processing unit 10 and storage unit 11, the computer realizing ink-wash-style 3D rendering in this embodiment further comprises a rendering unit 20, a first processing unit 21, a second processing unit 22 and a third processing unit 23.
The rendering unit 20 renders the object's model to the floating-point texture image. The first processing unit 21 samples each pixel of the floating-point texture image and determines the pixels of the object's contour, to form a texture image with the highlighted contour. The third processing unit 23 samples each pixel of the texture image with the highlighted contour and, according to each pixel's position, samples the pixel at the corresponding position of a noise texture image and mixes it with the corresponding sampled pixel, so that the highlighted contour takes on a rough-stroke effect. The second processing unit 22 samples the pixels of the texture image with the highlighted contour and, according to each pixel's position, samples the pixels at the corresponding positions of the rough-stroke texture image and the background texture image, mixes the corresponding pixels sampled from the three texture images, and outputs the result to the background texture image to present the object's model in ink-wash style.
The rendering is described in detail below using Microsoft's shading language Shader Model 2.0 as an example.
(1) The processing by which the rendering unit 20 forms floating-point texture image Image is the same as in embodiment one and is not repeated.
(2) The processing by which the first processing unit 21 forms texture image Edge is the same as in embodiment one and is not repeated.
(3) The third processing unit 23 draws a rectangle the same size as the screen, samples texture image Edge and the selected noise texture image noise, and processes them in the PixelShader:
A. Sample and process texture image Edge pixel by pixel. When sampling the current pixel, also take 31 further samples of the surrounding pixels centered on the current pixel's position (Pixel Shader 2.0 supports at most 32 texture samples).
The sampling is illustrated in Fig. 6A: apart from the current pixel in the middle, the remaining 31 sample positions (black) are scattered, which in effect imitates an artist's brushwork.
A sample code segment follows:
for (int i = 0; i < 31; i++)
{
    sum += tex2D(Edge, texCoord + samples[i]);
}   // take 31 samples around the pixel and add up the color values
Here samples[n] stores the texture-coordinate offsets.
B. According to the pixel position in texture image Edge, sample the pixel at the corresponding position of the noise texture image noise and mix it with sum, to achieve the rough-stroke effect:
colorFinal = (1 - hot*sum) + noise/mix;
Here hot corrects the brightness of the output pixel, and mix adjusts how strongly the noise texture influences the color. The color is output to texture image temp.
After the sampling of texture image Edge and noise texture image noise is finished, the resulting interim texture image temp is as shown in Fig. 6B.
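Numerically, the noise blend behaves as sketched below — the hot and mix values are illustrative, chosen so the arithmetic is exact:

```python
def stroke_with_noise(edge_sum, noise, hot=1.0, mix=2.0):
    # colorFinal = (1 - hot*sum) + noise/mix: the noise sample brightens the
    # stroke irregularly, giving the contour a rough, dry-brush look.
    return (1.0 - hot * edge_sum) + noise / mix

# A mid-strength stroke roughened by a mid-grey noise sample.
print(stroke_with_noise(0.5, 0.5))  # → 0.75
```

A larger mix suppresses the noise contribution (smoother strokes); a smaller mix amplifies it (rougher strokes).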
(4) The second processing unit 22 draws a rectangle the same size as the screen, samples texture image Edge, interim texture image temp and the selected background texture image paper, and processes them in the PixelShader:
A. Process pixel by pixel, sampling texture image Edge multiple times per pixel, e.g. taking 28 samples.
A sample code segment follows:
for (int i = 0; i < 28; i++)
{
    sum += tex2D(Edge, texCoord + dis * samples[i]);
}
Here samples[n] stores texture-coordinate offsets, and dis is a correction factor for those offsets: increasing dis slightly thickens the contour lines but lowers their density, compensating for the thinner, denser contour lines obtained in the previous step.
B. According to the pixel position in texture image Edge, sample interim texture image temp and background texture image paper (shown in Fig. 3D), and mix them with sum:
finalColor = min(min((1 - hot*sum) + noise/mix, temp), paper);
The minimum (darkest) R-channel value among all sampled texture colors is output as the color.
The final result is shown in Fig. 6C.
In this embodiment, if the rough-stroke effect is not required, the noise texture image noise is simply not sampled when forming the interim texture; the resulting image then shows only the brushwork effect. Correspondingly, in the apparatus, the third processing unit 23 samples each pixel of the texture image with the highlighted contour and, besides the current pixel, takes multiple samples of the surrounding pixels centered on the current pixel's position, forming interim texture image temp; the second processing unit 22 samples texture image Edge pixel by pixel and, according to each pixel's position, samples the pixels at the corresponding positions of interim texture image temp and background texture image paper, and mixes them with sum. The remaining processing is the same and is not repeated.
Obviously, those skilled in the art can make various changes and modifications to the present invention without departing from its spirit and scope. If such modifications fall within the scope of the claims of the present invention and their technical equivalents, the present invention is intended to cover them as well.
Claims (10)
1. A method for realizing the ink-wash painting style, characterized in that the processing is performed in 2D texture space, the method comprising the steps of:
A. rendering the model of an object to a first floating-point texture image;
B. sampling each pixel of the first floating-point texture image and determining which pixels lie on the object's contour, to form a second texture image in which the object's contour is highlighted;
C. sampling each pixel of the second texture image, sampling the pixel at the corresponding position of a background texture image according to each pixel's position, mixing it with the corresponding pixel sampled from the second texture image, and outputting the mixed pixels to the background texture image to present the object's model in ink-wash style.
2. The method of claim 1, characterized in that, in step A, a light-source position is further selected and the model is lit.
3. The method of claim 1, characterized in that, in step B, while sampling the current pixel the surrounding pixels are sampled at the same time, and whether the current pixel lies on the object's contour is determined from how sharply the depth values of the surrounding pixels vary.
4. The method of claim 3, characterized in that, in step B, before sampling the current pixel, the model's depth values are further filtered.
5. The method of any one of claims 1 to 4, characterized in that, before step C, it further comprises the step of:
B1. sampling each pixel of the second texture image and, besides the current pixel, taking multiple samples of the surrounding pixels centered on the current pixel's position, to form an interim texture image;
and, in step C, also sampling the pixel at the corresponding position of the interim texture image according to each pixel's position and mixing it with the corresponding pixels sampled from the second texture image and the background texture image.
6. The method of claim 5, characterized in that, in step B1, according to each pixel's position in the second texture image, the pixel at the corresponding position of a noise texture image is further sampled and mixed with the corresponding pixel sampled from the second texture image, so that the highlighted object contour in the interim texture image has a rough-stroke effect.
7. The method of claim 5, characterized in that, in step C, the minimum (darkest) color value among the currently sampled pixels of the second texture image and the background texture image is taken as the mixed pixel's color value.
8. An apparatus for realizing the ink-wash painting style, characterized by comprising:
a unit for rendering the model of an object to a first floating-point texture image;
a unit for sampling each pixel of the first floating-point texture image and determining the pixels of the object's contour, to form a second texture image in which the object's contour is highlighted;
a unit for sampling each pixel of the second texture image, sampling the pixel at the corresponding position of a background texture image according to each pixel's position, mixing it with the corresponding pixel sampled from the second texture image, and outputting the mixed pixels to the background texture image to present the object's model in ink-wash style.
9. An apparatus for realizing the ink-wash painting style, characterized by comprising:
a unit for rendering the model of an object to a first floating-point texture image;
a unit for sampling each pixel of the first floating-point texture image and determining the pixels of the object's contour, to form a second texture image in which the object's contour is highlighted;
a unit for sampling each pixel of the second texture image and, besides the current pixel, taking multiple samples of the surrounding pixels centered on the current pixel's position, to form a texture image;
a unit for sampling each pixel of the second texture image, sampling the pixels at the corresponding positions of that texture image and a background texture image according to each pixel's position, mixing them with the corresponding pixel sampled from the second texture image, and outputting the mixed pixels to the background texture image to present the object's model in ink-wash style.
10. The apparatus of claim 9, characterized in that it further comprises:
a unit for sampling, according to each pixel's position in the second texture image, the pixel at the corresponding position of a noise texture image and mixing it with the corresponding pixel sampled from the second texture image, so that the highlighted object contour in the texture image has a rough-stroke effect.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CNB2006100575998A CN100463003C (en) | 2006-03-16 | 2006-03-16 | Method and apparatus for implementing wash painting style |
Publications (2)
Publication Number | Publication Date |
---|---|
CN101038675A true CN101038675A (en) | 2007-09-19 |
CN100463003C CN100463003C (en) | 2009-02-18 |
Family
ID=38889558
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CNB2006100575998A Active CN100463003C (en) | 2006-03-16 | 2006-03-16 | Method and apparatus for implementing wash painting style |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN100463003C (en) |
Family Cites Families (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6943805B2 (en) * | 2002-06-28 | 2005-09-13 | Microsoft Corporation | Systems and methods for providing image rendering using variable rate source sampling |
CN1414517A (en) * | 2002-09-05 | 2003-04-30 | 何云 | Manufacturing method of computer wash painting cartoon |
JP4199170B2 (en) * | 2004-07-20 | 2008-12-17 | 株式会社東芝 | High-dimensional texture mapping apparatus, method and program |
- 2006-03-16: CN application CNB2006100575998A filed; granted as CN100463003C (status: active)
Cited By (18)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102087750A (en) * | 2010-06-13 | 2011-06-08 | 湖南宏梦信息科技有限公司 | Method for manufacturing cartoon special effect |
CN103685858A (en) * | 2012-08-31 | 2014-03-26 | 北京三星通信技术研究有限公司 | Real-time video processing method and equipment |
CN103116898A (en) * | 2013-01-30 | 2013-05-22 | 深圳深讯和科技有限公司 | Method and device for generating images in ink and wash painting style |
CN103116898B (en) * | 2013-01-30 | 2016-01-06 | 深圳深讯和科技有限公司 | Method and device for generating ink-wash style images
CN103218846A (en) * | 2013-04-16 | 2013-07-24 | 西安理工大学 | Ink painting simulation method of three-dimensional tree model |
CN103218846B (en) * | 2013-04-16 | 2016-06-01 | 西安理工大学 | Ink-wash simulation method for three-dimensional tree models
CN104463847A (en) * | 2014-08-05 | 2015-03-25 | 华南理工大学 | Ink and wash painting characteristic rendering method |
CN104658030B (en) * | 2015-02-05 | 2018-08-10 | 福建天晴数码有限公司 | Method and apparatus for secondary image mixing
CN104658030A (en) * | 2015-02-05 | 2015-05-27 | 福建天晴数码有限公司 | Secondary image mixing method and apparatus |
WO2018177112A1 (en) * | 2017-03-30 | 2018-10-04 | 腾讯科技(深圳)有限公司 | Object rendering method, device, storage medium, and electronic device |
CN107045729B (en) * | 2017-05-05 | 2018-09-18 | 腾讯科技(深圳)有限公司 | Image rendering method and device
CN107045729A (en) * | 2017-05-05 | 2017-08-15 | 腾讯科技(深圳)有限公司 | Image rendering method and device
CN107481200A (en) * | 2017-07-31 | 2017-12-15 | 腾讯科技(深圳)有限公司 | Image processing method and device |
CN109816755A (en) * | 2019-02-02 | 2019-05-28 | 珠海金山网络游戏科技有限公司 | Ink image production method, apparatus, computing device and storage medium
CN109993822A (en) * | 2019-04-10 | 2019-07-09 | 阿里巴巴集团控股有限公司 | Ink-wash style rendering method and apparatus
CN109993822B (en) * | 2019-04-10 | 2023-02-21 | 创新先进技术有限公司 | Ink and wash style rendering method and device |
CN112070873A (en) * | 2020-08-26 | 2020-12-11 | 完美世界(北京)软件科技发展有限公司 | Model rendering method and device |
CN112070873B (en) * | 2020-08-26 | 2021-08-20 | 完美世界(北京)软件科技发展有限公司 | Model rendering method and device |
Also Published As
Publication number | Publication date |
---|---|
CN100463003C (en) | 2009-02-18 |
Similar Documents
Publication | Title |
---|---|
CN101038675A (en) | Method and apparatus for implementing wash painting style |
CN2686241Y (en) | Image processing system and projector | |
CN1203453C (en) | Automatic coloring of exposed pixels during zone processing of image | |
CN1143245C (en) | Image processing apparatus and method | |
CN1222911C (en) | Image processing apparatus and method, and providing medium | |
CN1317681C (en) | Texturing method and apparatus | |
CN1217526C (en) | Image processing device, image processing method, image processing program and its recording medium | |
CN1741039A (en) | Face organ's location detecting apparatus, method and program | |
CN1750046A (en) | Three-dimensional ink and wash effect rendering method based on graphic processor | |
CN1678016A (en) | Image processing apparatus and image processing method | |
US20110205236A1 (en) | Image processing apparatus and storage medium having stored therein an image processing program | |
CN1735173A (en) | Display apparatus and image information generating method adapted to display apparatus | |
CN1343341A (en) | Image forming method and apparatus | |
CN1924899A (en) | Precise location method of QR code image symbol region at complex background | |
CN1713068A (en) | Automatic focus adjustment for projector | |
CN1928889A (en) | Image processing apparatus and method | |
CN1719455A (en) | Segmentation technique of a color image according to colors | |
CN1132887A (en) | Method of producing image data, image data processing apparatus, and recording medium | |
CN1700779A (en) | Display evaluation method and apparatus | |
CN1932885A (en) | Three-dimensional image processing | |
CN1484434A (en) | Method for proportionally scaling digital images in an embedded system |
CN1842134A (en) | Noise reducing apparatus and noise reducing method | |
CN1130667C (en) | Device and method for controlling quality of reproduction of motion picture | |
CN100339870C (en) | Anti-aliasing line pixel coverage calculation using programmable shader | |
CN1885296A (en) | Method for drawing map in game |
Legal Events
Code | Title |
---|---|
C06 | Publication |
PB01 | Publication |
C10 | Entry into substantive examination |
SE01 | Entry into force of request for substantive examination |
C14 | Grant of patent or utility model |
GR01 | Patent grant |