CN100463003C - Method and apparatus for implementing wash painting style


Info

Publication number
CN100463003C
CN100463003C CNB2006100575998A CN200610057599A
Authority
CN
China
Prior art keywords
texture image
pixel
sampling
pixels
background
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CNB2006100575998A
Other languages
Chinese (zh)
Other versions
CN101038675A (en)
Inventor
刘皓
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Tencent Technology Shenzhen Co Ltd
Original Assignee
Tencent Technology Shenzhen Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tencent Technology Shenzhen Co Ltd filed Critical Tencent Technology Shenzhen Co Ltd
Priority to CNB2006100575998A priority Critical patent/CN100463003C/en
Publication of CN101038675A publication Critical patent/CN101038675A/en
Application granted granted Critical
Publication of CN100463003C publication Critical patent/CN100463003C/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Landscapes

  • Processing Or Creating Images (AREA)
  • Image Generation (AREA)

Abstract

The invention discloses a method for realizing ink-wash style rendering that processes in the 2D texture space. The method comprises: rendering a model of an object to a first floating-point texture image; sampling each pixel of the first floating-point texture image and determining which pixels belong to the object's outline, so as to form a second texture image that highlights the object's outline; sampling each pixel of the second texture image, sampling pixels from the corresponding positions of a background texture image according to the positions of the respective pixels, mixing them with the corresponding pixels sampled from the second texture image, and outputting the mixed pixels to the background texture image to present the object model in an ink-wash style. The invention also discloses a device for realizing ink-wash style rendering.

Description

Method and device for realizing a wash painting style
Technical field
The present invention relates to the field of computer technology, and in particular to a method and device for realizing a wash painting style.
Background technology
Computer rendering of object models can be divided into photorealistic rendering (PR) and non-photorealistic rendering (NPR). PR studies actual physical lighting and shading to produce the rendering effects of the real world; NPR studies the abstract artistic conception that an artist sketches with a brush, and has so far developed and matured mostly in connection with Western painting, for example sketch, pen drawing, charcoal drawing, watercolor and cartoon. Existing examples include ATI's watercolor-rendering DEMO and cartoon-rendered games on the PS2 such as Dragon Quest 8, the Teenage Mutant Ninja Turtles series and Long Zhu.
At present, the outlining and brush strokes of the ink-wash style are realized mainly in physical space by the computer's central processing unit (CPU) to achieve an ink rendering effect. The main flow is as follows:
1. Search for the silhouette edges of the object model in 3D geometric space (handled by the CPU);
2. Perform visibility culling on the silhouette-edge vertices (lock the Z-buffer and compare the silhouette-edge vertices one by one to reject the hidden ones);
3. Connect the silhouette-edge vertices to form strokes (the CPU compares the positions of the visible silhouette-edge vertices one by one and connects them into strokes);
4. Wrap an artist's brush style onto the strokes (a texture with an alpha channel is attached to triangles as a texture map to produce the brush-stroke effect).
Although the above method can produce an ink-wash style rendering effect, it has the following shortcomings:
1. Because a series of key processing steps are performed on the CPU, the demands on the CPU are very high; on computers with weak CPUs the method is not only slow but also gives poor results.
2. The processing complexity is tied to the object model: if the model has few vertices it can run at a relatively high frame rate (fps, frames per second), but if the model has many vertices the processing becomes very difficult.
3. It is only suitable for ink rendering of a few simple main models in a scene, not for ink rendering of the whole scene.
Summary of the invention
The present invention provides a method for realizing a wash painting style, to solve the problems in the prior art that ink rendering of an object model places high demands on hardware performance and that ink rendering can only be performed on simple models.
The invention provides the following technical scheme:
A method for realizing a wash painting style processes in the 2D texture space and comprises the steps of:
A. rendering a model of an object to a first floating-point texture image;
B. sampling each pixel of the first floating-point texture image and determining which pixels belong to the object's outline, to form a second texture image that highlights the object's outline;
C. sampling each pixel of the second texture image and, according to each pixel's position, sampling a pixel from the corresponding position of a background texture image, mixing it with the corresponding pixel sampled from the second texture image, and outputting the mixed pixel to the background texture image to present the object model in an ink-wash style.
Wherein:
In the rendering process of step A, a light source position is further selected to illuminate the model.
In step B, when sampling a pixel on the first floating-point texture image, the surrounding pixels are sampled at the same time, and whether the current pixel lies on the object's outline is determined according to the degree of variation of the depth values of the surrounding pixels.
Between steps A and B, the method further comprises filtering the depth values of the model.
In step C, after sampling each pixel of the second texture image and before sampling pixels from the corresponding positions of the background texture image, the method further comprises the step of: repeatedly sampling the surrounding pixels centered on the position of each pixel sampled from the second texture image, to form an interim texture image.
In step C, after sampling pixels from the corresponding positions of the background texture image and before mixing them with the corresponding pixels sampled from the second texture image, the method further comprises the step of: sampling pixels from the corresponding positions of the interim texture image according to the position of each pixel of the second texture image, and mixing them with the corresponding pixels sampled from the second texture image and the background texture image.
In step C, after the interim texture image is formed and before pixels are sampled from the corresponding positions of the background texture image, the method further comprises the step of: sampling pixels from the corresponding positions of a noise texture image according to the position of each pixel sampled from the second texture image and mixing them with the corresponding pixels sampled from the second texture image, so that the object outline highlighted in the interim texture image has a rough brush-stroke effect.
In step C, the darkest color value among the currently sampled pixels of the second texture image and the background texture image is taken as the color value of the mixed pixel.
A device for realizing a wash painting style comprises:
a unit for rendering a model of an object to a first floating-point texture image;
a unit for sampling each pixel of the first floating-point texture image and determining the pixels of the object's outline, to form a second texture image that highlights the object's outline;
a unit for sampling each pixel of the second texture image and, according to each pixel's position, sampling a pixel from the corresponding position of a background texture image, mixing it with the corresponding pixel sampled from the second texture image, and outputting the mixed pixel to the background texture image to present the object model in an ink-wash style.
A device for realizing a wash painting style comprises:
a unit for rendering a model of an object to a first floating-point texture image;
a unit for sampling each pixel of the first floating-point texture image and determining which pixels belong to the object's outline, to form a second texture image that highlights the object's outline;
a unit for sampling each pixel of the second texture image and, according to each pixel's position, sampling pixels from the corresponding positions of a noise texture image and mixing them with the corresponding pixels sampled from the second texture image, to form a texture image in which the highlighted object outline has a rough brush-stroke effect;
a unit for sampling each pixel of that texture image and, according to each pixel's position, sampling pixels from the corresponding positions of a background texture image, mixing them with the corresponding pixels sampled from the texture image, and outputting the mixed pixels to the background texture image to present the object model in an ink-wash style.
Because the present invention processes pixels in the 2D texture space, the time complexity of the processing is unrelated to the number of model vertices and depends only on the size of each output interim texture. Therefore, the processing can be completed quickly even on computers with lower CPU performance, and it is also suitable for processing complex models.
Description of drawings
Fig. 1 is a flow chart of realizing ink rendering in embodiment one of the invention;
Fig. 2 is a schematic structural diagram of the device for realizing ink-wash rendering in embodiment one of the invention;
Fig. 3A is a schematic diagram of rendering a model to a floating-point texture image in embodiment one of the invention;
Fig. 3B is a schematic diagram of illuminating the model of Fig. 3A in embodiment one of the invention;
Fig. 3C is a schematic diagram of the texture image with the highlighted object outline in embodiment one of the invention;
Fig. 3D is a schematic diagram of the background texture image in embodiment one of the invention;
Fig. 3E is the final effect after ink rendering in embodiment one of the invention;
Fig. 4 is a flow chart of realizing ink rendering in embodiment two of the invention;
Fig. 5 is a schematic structural diagram of the device for realizing ink-wash rendering in embodiment two of the invention;
Fig. 6A is a schematic diagram of sampling pixels in embodiment two of the invention;
Fig. 6B is a schematic diagram of the texture image in which the object-outline strokes have a rough effect in embodiment two of the invention;
Fig. 6C is the final effect after ink rendering in embodiment two of the invention.
Detailed description of the embodiments
In order to reduce the hardware requirements and the processing time complexity of ink rendering, the present invention performs ink-wash style 3D rendering of object models based on the 2D texture space.
Embodiment one
Referring to Fig. 1, the main process of performing ink-wash style 3D rendering of an object model in this embodiment is as follows:
Step 100: render the model of the object to a 64-bit (not limited to 64-bit) floating-point texture image (A16B16G16R16F, i.e. a 64-bit texture with four channels A, B, G and R, each channel holding 16 bits stored in floating point).
In this step, the depth value of the object is output to the A, G and B channels of the floating-point texture, and the brightness value of the object is output to the R channel.
Step 110: sample the floating-point texture image obtained in step 100 pixel by pixel, determine which of the sampled pixels belong to the object's outline, and output the outline pixels and the non-outline pixels to a texture image with different brightness values, so as to form a texture image that highlights the object's outline.
Step 120: sample the texture image obtained in step 110 pixel by pixel; at the same time, according to each pixel's position, sample a pixel from the corresponding position of a selected background texture image, mix the corresponding pixels sampled from the two texture images, and output the result to the background texture image to present the object model in an ink-wash style.
Referring to Fig. 2, in this embodiment the computer that realizes ink-wash style 3D rendering comprises, in addition to the units that realize the basic functions of an existing computer, such as the central processing unit 10 and the storage unit 11, a rendering unit 20, a first processing unit 21 and a second processing unit 22.
The rendering unit 20 is used to render the model of the object to a floating-point texture image; the first processing unit 21 is used to sample each pixel of the floating-point texture image and determine the pixels of the object's outline, to form a texture image that highlights the object's outline; the second processing unit 22 samples each pixel of the texture image with the highlighted object outline, samples pixels from the corresponding positions of a selected background texture image according to each pixel's position, mixes the corresponding pixels sampled from the two texture images, and outputs the result to the background texture image to present the object model in an ink-wash style.
The rendering is described below taking an implementation with Microsoft's shading language, Shader Model 2.0, as an example. Shader Model 2.0 comprises Vertex Shader 2.0 and Pixel Shader 2.0; computers with an Nvidia GeForce FX 5200 or above, or an ATI Radeon 9500 or above, can support Shader Model 2.0.
1. The rendering unit 20 renders the object model (Model) to a 64-bit floating-point texture image Image (A16B16G16R16F), as shown in Fig. 3A. During rendering, the following processing is further carried out in the pixel shader:
A. Output the depth value of the object to the A, G and B channels of the floating-point texture. The brighter the object appears, the larger its depth value.
B. Illuminate the model using the positions of the two points in Fig. 3B as light-source positions:
color0 = light0 * normal;    // color0: the color after illumination by the first light source (light0)
color1 = light1 * normal;    // color1: the color after illumination by the second light source (light1)
colorFinal = max(color0, color1);    // take the larger of the two colors
Output colorFinal to the R channel of the floating-point texture.
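Purely as an illustration, a minimal HLSL pixel-shader sketch of this first pass is given below; the input semantics (interpolated normal and depth from the vertex shader) and the light-direction constants lightDir0 and lightDir1 are assumptions for the sketch, not part of the original disclosure:
// Sketch only: writes the lit brightness to R and the depth to G, B, A
// of the A16B16G16R16F render target.
float3 lightDir0;    // direction of the first light source (assumed)
float3 lightDir1;    // direction of the second light source (assumed)
float4 RenderModelPS(float3 normal : TEXCOORD0,
                     float  depth  : TEXCOORD1) : COLOR0
{
    float3 n = normalize(normal);
    float color0 = saturate(dot(n, normalize(lightDir0)));    // lit by light0
    float color1 = saturate(dot(n, normalize(lightDir1)));    // lit by light1
    float colorFinal = max(color0, color1);                   // keep the brighter result
    return float4(colorFinal, depth, depth, depth);           // R = brightness, G/B/A = depth
}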
2. The first processing unit 21 draws a rectangle the same size as the screen, samples the floating-point texture Image, and processes it in the pixel shader:
A. Filter the A channel of the floating-point texture image Image with horizontal and vertical filters. Since the values of the A, G and B channels are identical, only the A channel needs to be filtered (of course the A, G and B channels could each be filtered).
B. Sample and process the texture image Image pixel by pixel, also sampling the 8 pixels around each pixel.
SobelX            SobelY            Sobel
 1  0 -1           1  2  1           s00 s01 s02
 2  0 -2           0  0  0           s10 s11 s12
 1  0 -1          -1 -2 -1           s20 s21 s22
float sobelX = s00 + 2*s10 + s20 - s02 - 2*s12 - s22;
float sobelY = s00 + 2*s01 + s02 - s20 - 2*s21 - s22;
edgeSqr = sobelX*sobelX + sobelY*sobelY;
Here SobelX gives the weights applied to each pixel when filtering in the X direction, SobelY gives the weights applied when filtering in the Y direction, and Sobel gives the positions of the filtered pixels (a 3*3 grid); s11 is the current pixel, and edgeSqr measures how sharply the values of the pixels around the current pixel change.
If edgeSqr is greater than a set threshold n (n is generally 0.005 and can be fine-tuned slightly), the values of the pixels around the current pixel change sharply; since a pixel's value represents its depth, the depth around the current pixel also changes sharply, and the current pixel can therefore be determined to lie on the object's outline. If edgeSqr is less than the threshold n, the current pixel is not on the object's outline. For pixels on the object's outline, the value of the pixel's R channel (brightness) is output to the texture image Edge; for pixels not on the outline, 0 is output to the texture image Edge.
After the sampling of the floating-point texture image Image is finished, the resulting texture image Edge is as shown in Fig. 3C.
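The neighbour sampling, Sobel evaluation and thresholding above can be sketched as a single pixel shader roughly as follows; the sampler name, the texel-size constant pixelSize and the function name are assumptions:
// Sketch only: outputs the pixel's brightness where the depth changes sharply, 0 elsewhere.
sampler ImageSampler;    // the floating-point texture Image
float2  pixelSize;       // size of one texel, e.g. (1/width, 1/height) (assumed)
float   n = 0.005f;      // edge threshold from the description
float4 EdgeDetectPS(float2 uv : TEXCOORD0) : COLOR0
{
    // depth of the 8 surrounding pixels (depth is stored in the A channel)
    float s00 = tex2D(ImageSampler, uv + float2(-1, -1) * pixelSize).a;
    float s01 = tex2D(ImageSampler, uv + float2( 0, -1) * pixelSize).a;
    float s02 = tex2D(ImageSampler, uv + float2( 1, -1) * pixelSize).a;
    float s10 = tex2D(ImageSampler, uv + float2(-1,  0) * pixelSize).a;
    float s12 = tex2D(ImageSampler, uv + float2( 1,  0) * pixelSize).a;
    float s20 = tex2D(ImageSampler, uv + float2(-1,  1) * pixelSize).a;
    float s21 = tex2D(ImageSampler, uv + float2( 0,  1) * pixelSize).a;
    float s22 = tex2D(ImageSampler, uv + float2( 1,  1) * pixelSize).a;
    float sobelX  = s00 + 2*s10 + s20 - s02 - 2*s12 - s22;
    float sobelY  = s00 + 2*s01 + s02 - s20 - 2*s21 - s22;
    float edgeSqr = sobelX*sobelX + sobelY*sobelY;
    float brightness = tex2D(ImageSampler, uv).r;    // current pixel's brightness
    float outValue   = (edgeSqr > n) ? brightness : 0.0f;
    return float4(outValue, outValue, outValue, 1.0f);
}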
3. The second processing unit 22 draws a rectangle the same size as the screen, samples the texture image Edge and the selected background texture image paper, and processes them in the pixel shader:
A. Pixel by pixel, sample the texture image Edge multiple times, for example 28 times.
A sample code segment is as follows:
for (int i = 0; i < 28; i++)
{
    sum += tex2D(Edge, texCoord + dis * samples[i]);
}    // sample the texture image Edge 28 times and accumulate the color values
Here samples[n] stores texture-coordinate offsets, and dis is a correction applied to those offsets; increasing dis slightly makes the outline lines thicker but less dense, which compensates for the thinner but denser lines obtained in the previous step. dis * samples[i] is the offset relative to the current pixel's position.
B. According to the pixel positions in the texture image Edge, sample the pixels at the corresponding positions of the background texture image paper shown in Fig. 3D and mix them with sum.
colorFinal = min((1 - hot*sum), paper);
The minimum (darkest) value of the R channel among the colors of all the samples is taken and output as the color.
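Combining the 28-tap sampling of Edge with the paper mix, the whole composite pass might be sketched as below; the sampler names, the samples[] offsets and the constants dis and hot are assumptions whose concrete values the patent does not give:
// Sketch only: widens the outline with 28 offset samples of Edge and keeps the
// darkest of the result and the background paper texture.
sampler EdgeSampler;     // texture image Edge
sampler PaperSampler;    // background texture image paper
float2  samples[28];     // texture-coordinate offsets (values assumed)
float   dis;             // correction applied to the offsets
float   hot;             // output-brightness correction
float4 CompositePS(float2 texCoord : TEXCOORD0) : COLOR0
{
    float sum = 0.0f;
    for (int i = 0; i < 28; i++)                       // 28 samples of Edge, accumulated
        sum += tex2D(EdgeSampler, texCoord + dis * samples[i]).r;
    float paper = tex2D(PaperSampler, texCoord).r;     // paper at the same position
    float colorFinal = min(1 - hot * sum, paper);      // keep the darkest value
    return float4(colorFinal, colorFinal, colorFinal, 1.0f);
}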
The final effect is shown in Fig. 3E. It was produced on a computer platform with a 1.5 GHz Celeron, a Radeon 9700 SE and 512 MB of memory; the image size is 512 * 512 and the frame rate (fps) is about 100.
Because the above method processes in the 2D texture space, and with Shader Model 2.0 the pixel shader operates per pixel, the time complexity of the processing is unrelated to the number of model vertices and depends only on the size of each output interim texture.
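Purely to show how the three passes fit together, they could be bound in a D3D9 effect technique such as the sketch below; the application still has to draw the model for the first pass, draw the screen-sized rectangle for the other two, and switch render targets (Image, Edge, back buffer) between passes, and every name here is an assumption:
// Sketch only: one pass per step of embodiment one, using the pixel shaders sketched above.
technique InkWashEmbodimentOne
{
    pass RenderModel { PixelShader = compile ps_2_0 RenderModelPS(); }    // model -> Image
    pass EdgeDetect  { PixelShader = compile ps_2_0 EdgeDetectPS();  }    // Image -> Edge
    pass Composite   { PixelShader = compile ps_2_0 CompositePS();   }    // Edge + paper -> output
}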
Embodiment two
This embodiment adds processing for a rough brush-stroke effect to the ink rendering process described in embodiment one.
Referring to Fig. 4, the main process of performing ink-wash style 3D rendering of an object model in this embodiment is as follows:
Step 400: render the model of the object to a 64-bit (not limited to 64-bit) floating-point texture image (A16B16G16R16F).
In this step, the depth value of the object is output to the A, G and B channels of the floating-point texture, and the brightness value of the object is output to the R channel.
Step 410: sample the floating-point texture image obtained in step 400 pixel by pixel, determine which of the sampled pixels belong to the object's outline, and output the outline pixels and the non-outline pixels to a texture image with different brightness values, so as to form a texture image that highlights the object's outline.
Step 420: sample the texture image obtained in step 410 pixel by pixel; at the same time, according to each pixel's position, sample a pixel from the corresponding position of a selected noise texture image, mix the corresponding pixels sampled from the two texture images, and output the result to an interim texture image so that the highlighted object outline has a rough brush-stroke effect.
When sampling the pixels of the texture image obtained in step 410, besides sampling the current pixel, the surrounding pixels are also sampled repeatedly, centered on the current pixel's position.
Step 430: sample the texture image obtained in step 410 pixel by pixel; at the same time, according to each pixel's position, sample pixels from the corresponding positions of the interim texture image obtained in step 420 and of a selected background texture image, mix the corresponding pixels sampled from the three texture images, and output the result to the background texture image to present the object model in an ink-wash style.
Referring to Fig. 5, in this embodiment the computer that realizes ink-wash style 3D rendering comprises, in addition to the units that realize the basic functions of an existing computer, such as the central processing unit 10 and the storage unit 11, a rendering unit 20, a first processing unit 21, a second processing unit 22 and a third processing unit 23.
The rendering unit 20 is used to render the model of the object to a floating-point texture image; the first processing unit 21 is used to sample each pixel of the floating-point texture image and determine the pixels of the object's outline, to form a texture image that highlights the object's outline; the third processing unit 23 samples each pixel of the texture image with the highlighted object outline, samples pixels from the corresponding positions of a noise texture image according to each pixel's position, and mixes them with the corresponding pixels sampled from the texture image with the highlighted outline, so that the highlighted object outline has a rough brush-stroke effect; the second processing unit 22 samples the pixels of the texture image with the highlighted object outline, samples pixels from the corresponding positions of the texture image with the rough brush-stroke effect and of the background texture image according to each pixel's position, mixes the corresponding pixels sampled from the three texture images, and outputs the result to the background texture image to present the object model in an ink-wash style.
The rendering is again described taking an implementation with Microsoft's shading language, Shader Model 2.0, as an example.
(1) The processing by which the rendering unit 20 forms the floating-point texture image Image is the same as in embodiment one (not repeated here).
(2) The processing by which the first processing unit 21 forms the texture image Edge is the same as in embodiment one (not repeated here).
(3) The third processing unit 23 draws a rectangle the same size as the screen, samples the texture image Edge and the selected noise texture image noise, and processes them in the pixel shader:
A. Sample and process the texture image Edge pixel by pixel. Besides sampling the current pixel, 31 samples of the surrounding pixels are also taken, centered on the current pixel's position (Pixel Shader 2.0 supports at most 32 texture samples).
The sampling pattern is illustrated in Fig. 6A: apart from the current pixel in the middle, the remaining 31 texture samples (in black) are scattered around it, which effectively corresponds to an artist's brush stroke.
A sample code segment is as follows:
for (int i = 0; i < 31; i++)
{
    sum += tex2D(Edge, texCoord + samples[i]);
}    // take 31 samples around the pixel and accumulate the color values
Here samples[n] stores the texture-coordinate offsets.
B. According to the pixel positions in the texture image Edge, sample the pixels at the corresponding positions of the noise texture image noise and mix them with sum, to achieve the rough brush-stroke effect.
colorFinal = (1 - hot*sum) + noise/mix;
Here hot adjusts the brightness of the output pixel, and mix adjusts how strongly the noise texture influences the color; the color is output to the texture image temp.
After the sampling of the texture image Edge and the noise texture image noise is finished, the resulting interim texture image temp is as shown in Fig. 6B.
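A sketch of this rough-stroke pass in the same style is given below; as before, the sampler names and the constants are assumptions, and the description's mix constant is written as mixFactor here for clarity:
// Sketch only: scatters 31 offset samples of Edge around each pixel and adds noise
// to give the highlighted outline a rough brush-stroke look, written to temp.
sampler EdgeSampler;     // texture image Edge
sampler NoiseSampler;    // noise texture image noise
float2  samples[31];     // scattered texture-coordinate offsets (values assumed)
float   hot;             // output-brightness correction
float   mixFactor;       // the "mix" constant: strength of the noise contribution
float4 RoughStrokePS(float2 texCoord : TEXCOORD0) : COLOR0
{
    float sum = 0.0f;
    for (int i = 0; i < 31; i++)                          // 31 scattered samples (a zero offset
        sum += tex2D(EdgeSampler, texCoord + samples[i]).r;    // can stand for the current pixel)
    float noise = tex2D(NoiseSampler, texCoord).r;        // noise at the same position
    float colorFinal = (1 - hot * sum) + noise / mixFactor;
    return float4(colorFinal, colorFinal, colorFinal, 1.0f);
}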
(4) The second processing unit 22 draws a rectangle the same size as the screen, samples the texture image Edge, the interim texture image temp and the selected background texture image paper, and processes them in the pixel shader:
A. Pixel by pixel, sample the texture image Edge multiple times, for example 28 times.
A sample code segment is as follows:
for (int i = 0; i < 28; i++)
{
    sum += tex2D(Edge, texCoord + dis * samples[i]);
}
Here samples[n] stores texture-coordinate offsets, and dis is a correction applied to those offsets; increasing dis slightly makes the outline lines thicker but less dense, which compensates for the thinner but denser lines obtained in the previous step.
B. According to the pixel positions in the texture image Edge, sample the interim texture image temp and the background texture image paper (shown in Fig. 3D), and mix them with sum.
finalColor = min(min((1 - hot*sum) + noise/mix, temp), paper);
The minimum (darkest) value of the R channel among the colors of all the samples is taken and output as the color.
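The final composite of embodiment two can be sketched along the same lines; the sampler for temp is an added assumption and the remaining names match the sketches above:
// Sketch only: keeps the darkest of the widened outline (plus noise), the rough-stroke
// texture temp and the background paper, following the finalColor formula above.
sampler EdgeSampler;     // texture image Edge
sampler TempSampler;     // interim texture image temp
sampler PaperSampler;    // background texture image paper
sampler NoiseSampler;    // noise texture image noise
float2  samples[28];     // texture-coordinate offsets (values assumed)
float   dis, hot, mixFactor;
float4 FinalCompositePS(float2 texCoord : TEXCOORD0) : COLOR0
{
    float sum = 0.0f;
    for (int i = 0; i < 28; i++)
        sum += tex2D(EdgeSampler, texCoord + dis * samples[i]).r;
    float noise = tex2D(NoiseSampler, texCoord).r;
    float temp  = tex2D(TempSampler,  texCoord).r;
    float paper = tex2D(PaperSampler, texCoord).r;
    float finalColor = min(min((1 - hot * sum) + noise / mixFactor, temp), paper);
    return float4(finalColor, finalColor, finalColor, 1.0f);
}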
The final effect is shown in Fig. 6C.
In this embodiment, if the rough brush-stroke effect is not required, the noise texture image noise is simply not sampled when forming the texture image, and the texture image thus formed has only the brush-stroke effect. Correspondingly, in the device, when the third processing unit 23 samples each pixel of the texture image with the highlighted object outline, besides sampling the current pixel it also repeatedly samples the surrounding pixels centered on the current pixel's position, to form the interim texture image temp; the second processing unit 22 samples the texture image Edge pixel by pixel, samples pixels from the corresponding positions of the interim texture image temp and the background texture image paper according to the pixel positions, and mixes them with sum. The remaining processing is the same and is not repeated here.
Obviously, those skilled in the art can make various changes and modifications to the present invention without departing from its spirit and scope. If these modifications and variations fall within the scope of the claims of the present invention and their technical equivalents, the present invention is also intended to include them.

Claims (10)

1. A method for realizing a wash painting style, characterized in that processing is performed in the 2D texture space, the method comprising the steps of:
A. rendering a model of an object to a first floating-point texture image;
B. sampling each pixel of the first floating-point texture image and determining which pixels belong to the object's outline, to form a second texture image that highlights the object's outline;
C. sampling each pixel of the second texture image and, according to each pixel's position, sampling a pixel from the corresponding position of a background texture image, mixing it with the corresponding pixel sampled from the second texture image, and outputting the mixed pixel to the background texture image to present the object model in an ink-wash style.
2. the method for claim 1 is characterized in that, in the render process in steps A, further illumination is carried out to model in the selective light source position.
3. the method for claim 1, it is characterized in that, among the step B, when each current pixel on the first floating-point texture image of sampling, simultaneously the pixel around this current pixel is sampled, determine according to the intensity of variation of the depth value of the pixel around described whether this current pixel is in the outline position of object.
4. the method for claim 1 is characterized in that, comprises also that between described steps A and B the depth value to model carries out filtration treatment.
5. The method of any one of claims 1 to 4, characterized in that, in step C, after each pixel of the second texture image is sampled and before pixels are sampled from the corresponding positions of the background texture image, the method further comprises the step of:
repeatedly sampling the surrounding pixels centered on the position of each pixel sampled from the second texture image, to form an interim texture image;
and, in step C, after pixels are sampled from the corresponding positions of the background texture image and before the pixels sampled from the background texture image are mixed with the corresponding pixels sampled from the second texture image, the method further comprises the step of:
sampling pixels from the corresponding positions of the interim texture image according to the position of each pixel of the second texture image, and mixing them with the corresponding pixels sampled from the second texture image and the background texture image.
6. The method of claim 5, characterized in that, in step C, after the interim texture image is formed and before pixels are sampled from the corresponding positions of the background texture image, the method further comprises the step of:
sampling pixels from the corresponding positions of a noise texture image according to the position of each pixel sampled from the second texture image and mixing them with the corresponding pixels sampled from the second texture image, so that the object outline highlighted in the interim texture image has a rough brush-stroke effect.
7. The method of claim 5, characterized in that, in step C, the darkest color value among the currently sampled pixels of the second texture image and the background texture image is taken as the color value of the mixed pixel.
8. A device for realizing a wash painting style, characterized by comprising:
a unit for rendering a model of an object to a first floating-point texture image;
a unit for sampling each pixel of the first floating-point texture image and determining the pixels of the object's outline, to form a second texture image that highlights the object's outline;
a unit for sampling each pixel of the second texture image and, according to each pixel's position, sampling a pixel from the corresponding position of a background texture image, mixing it with the corresponding pixel sampled from the second texture image, and outputting the mixed pixel to the background texture image to present the object model in an ink-wash style.
9. A device for realizing a wash painting style, characterized by comprising:
a unit for rendering a model of an object to a first floating-point texture image;
a unit for sampling each pixel of the first floating-point texture image and determining which pixels belong to the object's outline, to form a second texture image that highlights the object's outline;
a unit for sampling each pixel of the second texture image and, besides sampling the current pixel, repeatedly sampling the surrounding pixels centered on the current pixel's position, to form a texture image;
a unit for sampling each pixel of the second texture image and, according to each pixel's position, sampling pixels from the corresponding positions of the texture image and of a background texture image, mixing them with the corresponding pixel sampled from the second texture image, and outputting the mixed pixel to the background texture image to present the object model in an ink-wash style.
10. The device of claim 9, characterized in that the device further comprises:
a unit for, after the unit that forms the texture image has formed the texture image, sampling pixels from the corresponding positions of a noise texture image according to the position of each pixel of the second texture image and mixing them with the corresponding pixels sampled from the second texture image, so that the object outline highlighted in the texture image has a rough brush-stroke effect;
wherein the unit for presenting the object model in an ink-wash style samples each pixel of the second texture image and, according to each pixel's position, samples pixels from the corresponding positions of the texture image with the rough brush-stroke effect and of the background texture image, mixes them with the corresponding pixel sampled from the second texture image, and outputs the mixed pixel to the background texture image.
CNB2006100575998A 2006-03-16 2006-03-16 Method and apparatus for implementing wash painting style Active CN100463003C (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CNB2006100575998A CN100463003C (en) 2006-03-16 2006-03-16 Method and apparatus for implementing wash painting style

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CNB2006100575998A CN100463003C (en) 2006-03-16 2006-03-16 Method and apparatus for implementing wash painting style

Publications (2)

Publication Number Publication Date
CN101038675A CN101038675A (en) 2007-09-19
CN100463003C true CN100463003C (en) 2009-02-18

Family

ID=38889558

Family Applications (1)

Application Number Title Priority Date Filing Date
CNB2006100575998A Active CN100463003C (en) 2006-03-16 2006-03-16 Method and apparatus for implementing wash painting style

Country Status (1)

Country Link
CN (1) CN100463003C (en)

Families Citing this family (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102087750A (en) * 2010-06-13 2011-06-08 湖南宏梦信息科技有限公司 Method for manufacturing cartoon special effect
CN103685858A (en) * 2012-08-31 2014-03-26 北京三星通信技术研究有限公司 Real-time video processing method and equipment
CN103116898B (en) * 2013-01-30 2016-01-06 深圳深讯和科技有限公司 Generate method and the device of ink and wash style image
CN103218846B (en) * 2013-04-16 2016-06-01 西安理工大学 The ink and wash analogy method of Three-dimension Tree model
CN104463847A (en) * 2014-08-05 2015-03-25 华南理工大学 Ink and wash painting characteristic rendering method
CN104658030B (en) * 2015-02-05 2018-08-10 福建天晴数码有限公司 The method and apparatus of secondary image mixing
CN107123077B (en) * 2017-03-30 2019-01-08 腾讯科技(深圳)有限公司 The rendering method and device of object
CN107045729B (en) * 2017-05-05 2018-09-18 腾讯科技(深圳)有限公司 A kind of image rendering method and device
CN107481200B (en) * 2017-07-31 2018-09-18 腾讯科技(深圳)有限公司 Image processing method and device
CN109816755A (en) * 2019-02-02 2019-05-28 珠海金山网络游戏科技有限公司 A kind of production method of ink image, calculates equipment and storage medium at device
CN109993822B (en) * 2019-04-10 2023-02-21 创新先进技术有限公司 Ink and wash style rendering method and device
CN112070873B (en) * 2020-08-26 2021-08-20 完美世界(北京)软件科技发展有限公司 Model rendering method and device

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1414517A (en) * 2002-09-05 2003-04-30 何云 Manufacturing method of computer wash painting cartoon
US6943805B2 (en) * 2002-06-28 2005-09-13 Microsoft Corporation Systems and methods for providing image rendering using variable rate source sampling
JP2006031561A (en) * 2004-07-20 2006-02-02 Toshiba Corp High-dimensional texture mapping device, method and program

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6943805B2 (en) * 2002-06-28 2005-09-13 Microsoft Corporation Systems and methods for providing image rendering using variable rate source sampling
CN1414517A (en) * 2002-09-05 2003-04-30 何云 Manufacturing method of computer wash painting cartoon
JP2006031561A (en) * 2004-07-20 2006-02-02 Toshiba Corp High-dimensional texture mapping device, method and program

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Real-time rendering of 3D Chinese painting effects. 张海嵩, 尹小勤, 于金辉. Journal of Computer-Aided Design & Computer Graphics, Vol. 16, No. 11, 2004 *

Also Published As

Publication number Publication date
CN101038675A (en) 2007-09-19

Similar Documents

Publication Publication Date Title
CN100463003C (en) Method and apparatus for implementing wash painting style
CN107045729B (en) A kind of image rendering method and device
US8289342B2 (en) Image processing apparatus and storage medium having stored therein an image processing program
US5680525A (en) Three-dimensional graphic system with an editor for generating a textrue mapping image
CN108537861B (en) Map generation method, device, equipment and storage medium
CN108389257A (en) Threedimensional model is generated from sweep object
CN109045691B (en) Method and device for realizing special effect of special effect object
CN113178014A (en) Scene model rendering method and device, electronic equipment and storage medium
CN1265502A (en) Image processing apparatus and method
CN109711246B (en) Dynamic object recognition method, computer device and readable storage medium
DE102016103854A1 (en) Graphics processing with directional representations of illumination at probe positions within a scene
JP2018205123A (en) Image generation device and image generation method of generating an inspection-purpose image for making performance adjustment of image inspection system
US7327364B2 (en) Method and apparatus for rendering three-dimensional images of objects with hand-drawn appearance in real time
CN112419334A (en) Micro surface material reconstruction method and system based on deep learning
JP3352982B2 (en) Rendering method and device, game device, and computer-readable recording medium for storing program for rendering three-dimensional model
WO2002056253A1 (en) Method for representing color paper mosaic using computer
JPH08161530A (en) Icon preparing method and frame preparing method for dynamic image
EP1374169A2 (en) Application of visual effects to a region of interest within an image
CN103795925A (en) Interactive main-and-auxiliary-picture real-time rendering photographing method
Burgess et al. A system for real-time watercolour rendering
JP3501479B2 (en) Image processing device
Huang et al. A GPU implementation of watercolor painting image generation
JP3157015B2 (en) Image processing method and image processing apparatus
CN115937457B (en) Real-time topography sketch method based on DEM image
Kolker et al. The Ray Tracing based Tool for Generation Artificial Images and Neural Network Training.

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant