CN102034262A - Texture filtering method and device based on anisotropy - Google Patents

Texture filtering method and device based on anisotropy

Info

Publication number
CN102034262A
CN102034262A
Authority
CN
China
Prior art keywords
sampling
layer
axis radius
texture
sampling layer
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN2009101781248A
Other languages
Chinese (zh)
Inventor
李康
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
BYD Co Ltd
Original Assignee
BYD Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by BYD Co Ltd
Priority to CN2009101781248A
Publication of CN102034262A
Legal status: Pending

Landscapes

  • Image Processing (AREA)

Abstract

The invention provides an anisotropy-based texture filtering method comprising the following steps: performing MIP-MAP (multum in parvo map) pre-filtering on a texture image; regarding a pixel in screen space as a circle with a radius of one pixel, projecting the pixel into texture space, and approximating the projection with an ellipse; calculating the minor-axis radius, the major-axis radius, and the angle between the major axis and the u axis of the ellipse; determining a sampling layer L in a texture lookup table from the minor-axis radius; determining the number of samples on layer L and the positions of the sampling points along the major axis of the ellipse from the minor-axis radius, the major-axis radius, and the major-axis angle, and sampling on layer L; sampling on layer L+1 along the major axis of the ellipse according to the layer-L+1 sample count; and linearly interpolating the color values obtained from layers L and L+1 to obtain the final color value. By using one quarter of the layer-L sample count as the layer-L+1 sample count, the invention avoids oversampling and keeps the texture sharp even where it recedes far from the viewpoint.

Description

Anisotropy-based texture filtering method and apparatus
Technical field
The present invention relates to the field of image processing, and in particular to an anisotropy-based texture filtering method and apparatus.
Background art
To speed up texture mapping and reduce its computational cost, the texture is usually pre-filtered, so that each pixel only needs to fetch and combine a small number of pre-filtered samples, which improves efficiency.
In the prior art, pre-filtering is usually defined by a double integral: g(x, y) = ∫∫ f(x-u, y-v) h(u, v) du dv, where f is the input image (the texture image), g is the output image, and h is the filter kernel. The filter used by the trilinear interpolation algorithm is square, so it is a square filtering technique. A typical square filtering technique is MIP-MAP (multum in parvo map) pre-filtering. Its main idea is to represent the texture image as an array of textures at different resolutions. For example, given a 512 x 512 texture image, the texture space is divided by texel into 512 x 512 small squares, and the average texture value of each square is taken as the first-level standard sample, called layer 0. The texture space is then divided into 256 x 256 squares and the average of each square (i.e. the average of the four neighboring texel colors of the previous level) is taken as the second-level standard sample, producing a filtered image of half the resolution, called layer 1. This process is repeated on each new image until the resolution reaches 1 x 1, forming a pyramidal texture storage structure. In the trilinear interpolation algorithm, a screen-space pixel is regarded as a square with a side of one pixel; it is projected into texture space, the projected shape is approximated with a square, and the side length of that square is computed. This side length determines which pyramid layer L to sample. Four samples are then taken on each of layers L and L+1 and bilinearly interpolated, yielding one color value per layer, and a final linear interpolation between these two values gives the pixel color. As this description shows, the technique performs three successive linear interpolations, and is therefore called trilinear interpolation or trilinear filtering. Since the filter it uses is square, it is a square filtering technique.
Trilinear filtering is built on an isotropic, square filter. The mapping of a pixel, however, is anisotropic, so an isotropic trilinear filtering method blurs the image.
Therefore, to overcome the shortcomings of isotropic filtering, anisotropy-based filtering methods have also been proposed in the prior art.
One existing anisotropy-based filtering method is the Feline (fast elliptical lines) algorithm. Its idea is to regard a screen-space pixel as a circle centered at that pixel with a radius of one pixel, project it into texture space where it becomes an ellipse, compute the minor-axis and major-axis radii of the ellipse, use the minor-axis radius to decide which layer to sample, and use the minor- and major-axis radii to decide the number of samples taken along the major axis.
Although the Feline algorithm approximates the ellipse with several isotropic trilinear filtering operations along the major axis, which reduces the cost of a hardware implementation, each of these trilinear filtering operations samples both layer L and layer L+1. The sampling is therefore sufficient on layer L but oversamples layer L+1, which causes some blurring of the image. A method is therefore needed to solve this problem.
Summary of the invention
The purpose of the present invention is to address at least one of the above technical defects, and in particular to solve the problem that the anisotropic Feline algorithm blurs the image by oversampling layer L+1.
To this end, the present invention proposes an anisotropy-based texture filtering method comprising the following steps: performing MIP-MAP (multum in parvo map) pre-filtering on a texture image, representing the texture image as an array of textures at different resolutions to serve as a texture lookup table; projecting a screen-space pixel into texture space, where the pixel becomes an ellipse; calculating the minor-axis radius, the major-axis radius, and the angle between the major axis and the u axis of the ellipse; determining a sampling layer L in the texture lookup table from the minor-axis radius of the ellipse; determining the number of samples on sampling layer L and the positions of the sampling points along the major axis of the ellipse from the minor-axis radius, the major-axis radius, and the major-axis angle, sampling on sampling layer L, and obtaining its color value; sampling on sampling layer L+1 along the major axis according to the layer-L+1 sample count, and obtaining its color value; and linearly interpolating the color values obtained on sampling layers L and L+1 to obtain the final color value.
In another aspect, the present invention also proposes an anisotropy-based texture filtering apparatus comprising a pre-filtering module, a projection module, and a computing module. The pre-filtering module performs MIP-MAP pre-filtering on a texture image, representing it as an array of textures at different resolutions to serve as a texture lookup table. The projection module projects a screen-space pixel into texture space, where the pixel becomes an ellipse. The computing module calculates the minor-axis radius, the major-axis radius, and the angle between the major axis and the u axis of the ellipse; determines the sampling layer L in the texture lookup table from the minor-axis radius; determines the number of samples on sampling layer L and the positions of the sampling points along the major axis from the minor-axis radius, the major-axis radius, and the major-axis angle; samples on sampling layer L to obtain its color value; then samples on sampling layer L+1 along the major axis according to the layer-L+1 sample count to obtain its color value; and linearly interpolates the color values obtained on sampling layers L and L+1 to obtain the final color value.
By using one quarter of the layer-L sample count as the layer-L+1 sample count, the invention avoids oversampling and keeps the texture sharp even where it recedes far from the viewpoint.
Additional aspects and advantages of the invention are set forth in part in the description below; they will partly become apparent from that description or be learned through practice of the invention.
Description of drawings
The above and/or additional aspects and advantages of the present invention will become apparent and readily understood from the following description of embodiments taken in conjunction with the accompanying drawings, in which:
Fig. 1 is a flowchart of the anisotropy-based texture filtering method of an embodiment of the invention;
Fig. 2 is a schematic diagram of the image pyramid of an embodiment of the invention;
Fig. 3 is a schematic diagram of the storage layout of the image pyramid of an embodiment of the invention;
Fig. 4 is a schematic diagram of the pixel projection of an embodiment of the invention;
Fig. 5 is a schematic diagram of the ellipse obtained by projecting the screen-space pixel (x, y) into texture space in an embodiment of the invention;
Fig. 6 is a schematic diagram of the case where the minor-axis and major-axis radii of the ellipse in texture space are smaller than one texel, for an embodiment of the invention;
Fig. 7 is a schematic diagram of the sampling when the layer-L sample count is even, for an embodiment of the invention;
Fig. 8 is a schematic diagram of the sampling when the layer-L sample count is odd, for an embodiment of the invention;
Fig. 9 is a schematic diagram of the sampling on layers L and L+1 of an embodiment of the invention;
Fig. 10 is a structural diagram of the anisotropy-based texture filtering apparatus of an embodiment of the invention;
Fig. 11-1 is a schematic diagram of the pixel projection of the trilinear filtering method;
Fig. 11-2 is a schematic diagram of the sampling of the trilinear filtering method;
Fig. 11-3 is the texture pattern used in the embodiment of the invention;
Fig. 11-4 is a schematic diagram of the projection result of the trilinear filtering method;
Fig. 11-5 is a schematic diagram of the region inside the dark border of Fig. 11-4 magnified three times;
Fig. 12-1 is a schematic diagram of the sampling of the Feline algorithm;
Fig. 12-2 is a schematic diagram of the projection result of the Feline algorithm;
Fig. 12-3 is a schematic diagram of the region inside the dark border of Fig. 12-2 magnified three times;
Fig. 13-1 is a schematic diagram of the projection result of the filtering method of an embodiment of the invention;
Fig. 13-2 is a schematic diagram of the region inside the dark border of Fig. 13-1 magnified three times.
Embodiment
Embodiments of the invention are described in detail below; examples of the embodiments are shown in the drawings, where identical or similar reference numbers denote identical or similar elements or elements with identical or similar functions throughout. The embodiments described below with reference to the drawings are exemplary; they are intended only to explain the invention and are not to be construed as limiting it.
Fig. 1 is a flowchart of the anisotropy-based texture filtering method of an embodiment of the invention, which comprises the following steps:
Step S101: perform MIP-MAP (multum in parvo map) pre-filtering on the texture image, representing the texture image as an array of textures at a number of different resolutions to serve as the texture lookup table, and forming an image pyramid whose resolution decreases level by level.
In an embodiment of the invention, each level has half the resolution of the level above it. Take an original texture image of 256 x 256 as an example. First, the texture space is divided by texel into 256 x 256 small squares, and the average texture value of each square is taken as the first-level standard sample data, called layer 0. The next layer then has half that resolution: the texture space is divided into 128 x 128 squares and the average texture value of each square is taken as the second-level standard sample data, called layer 1. This continues until the resolution reaches 1 x 1, forming the image pyramid with level-by-level decreasing resolution shown in Fig. 2.
After MIP-MAP pre-filtering, the set of images is stored in a MIP-MAP table. Fig. 3 shows the storage layout of the image pyramid of an embodiment of the invention: the MIP-MAP table containing the red (R), green (G), and blue (B) components is stored in a 512 x 512 memory block. The MIP-MAP table of a texture image therefore requires about 4/3 of the memory occupied by the texture image itself.
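As an illustration of this pyramid construction, the following Python sketch builds the MIP-MAP levels by repeatedly averaging 2 x 2 texel blocks. It assumes a square RGB texture whose side is a power of two and keeps the levels in a simple list rather than the packed 512 x 512 layout of Fig. 3; the function name and array layout are choices of this sketch, not details fixed by the patent.
```python
import numpy as np

def build_mipmap(texture):
    """Build a MIP-MAP pyramid by repeatedly averaging 2x2 texel blocks.

    texture: float array of shape (N, N, 3), N a power of two.
    Returns a list of levels: level 0 is the full-resolution image and each
    following level halves the resolution until a 1x1 image is reached.
    """
    levels = [np.asarray(texture, dtype=np.float64)]
    while levels[-1].shape[0] > 1:
        prev = levels[-1]
        n = prev.shape[0] // 2
        # Each texel of the next level is the mean of a 2x2 block of the previous level.
        levels.append(prev.reshape(n, 2, n, 2, 3).mean(axis=(1, 3)))
    return levels
```
Stored this way, the whole pyramid occupies 1 + 1/4 + 1/16 + ... ≈ 4/3 of the original image, matching the memory figure quoted above.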
It should be understood that the above embodiment is merely illustrative and does not limit the scope of the invention.
Step S102: project the screen-space pixel into texture space.
As shown in Fig. 4, a schematic diagram of the pixel projection of an embodiment of the invention, a screen-space pixel is regarded as a circle centered at that pixel with a radius of one pixel, so that the regions of neighboring pixels overlap. After projection into texture space the pixel is regarded as an ellipse, and the ellipses in texture space likewise overlap. This ensures that enough samples are taken in texture space to resist aliasing.
Step S103: calculate the minor-axis radius, the major-axis radius, and the angle between the major axis and the u axis of the ellipse.
Fig. 5 shows the ellipse obtained by projecting the screen-space pixel (x, y) into texture space. In texture space, define the vectors r1 = (∂u/∂x, ∂v/∂x) and r2 = (∂u/∂y, ∂v/∂y). If F denotes the transform from screen coordinates (x, y) to texture coordinates (u, v), so that u = Fu(x, y) and v = Fv(x, y), then r1 and r2 can also be expressed as:
r1 = (Fu(x+1, y) - Fu(x, y), Fv(x+1, y) - Fv(x, y))
r2 = (Fu(x, y+1) - Fu(x, y), Fv(x, y+1) - Fv(x, y)).
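A small sketch of how r1 and r2 can be obtained in code; Fu and Fv here stand for caller-supplied functions implementing the screen-to-texture transform, which the text leaves abstract.
```python
def texture_gradients(Fu, Fv, x, y):
    """Forward differences approximating r1 = (du/dx, dv/dx) and r2 = (du/dy, dv/dy)
    at screen pixel (x, y), following the expressions for r1 and r2 above."""
    du_dx = Fu(x + 1, y) - Fu(x, y)
    dv_dx = Fv(x + 1, y) - Fv(x, y)
    du_dy = Fu(x, y + 1) - Fu(x, y)
    dv_dy = Fv(x, y + 1) - Fv(x, y)
    return du_dx, du_dy, dv_dx, dv_dy
```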
The general equation of an ellipse centered at the origin is Ann*x^2 + Bnn*x*y + Cnn*y^2 = F, or, after dividing both sides by F, A*x^2 + B*x*y + C*y^2 = 1.
From the vectors r1 and r2, the minor-axis radius minorRadius and the major-axis radius majorRadius of the ellipse in texture space can be computed with the following formulas:
Ann = (∂v/∂x)^2 + (∂v/∂y)^2;
Bnn = -2*(∂u/∂x * ∂v/∂x + ∂u/∂y * ∂v/∂y);
Cnn = (∂u/∂x)^2 + (∂u/∂y)^2;
F = Ann*Cnn - Bnn^2/4;
A = Ann/F;
B = Bnn/F;
C = Cnn/F;
root = sqrt((A-C)^2 + B^2);
A' = (A+C-root)/2;
C' = (A+C+root)/2;
majorRadius = sqrt(1/A');
minorRadius = sqrt(1/C').
In an embodiment of the invention, the minor-axis and major-axis radii of the ellipse in texture space may both be smaller than 1, as illustrated in Fig. 6, which shows the case where both radii are smaller than one texel. In the embodiment of the invention both radii are required to be no smaller than 1, so:
minorRadius = max(minorRadius, 1),
majorRadius = max(majorRadius, 1).
From the above results, the angle theta between the major axis and the u axis can be computed:
theta = arctan(B/(A-C))/2.
If A > C, the theta computed above is the angle between the minor axis and the u axis, and the angle between the major axis and the u axis is then theta = theta + π/2.
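The following sketch transcribes the formulas of step S103 directly. It assumes a non-degenerate mapping (F > 0), and the fallback used when A = C is an assumption of this sketch, since the text does not spell that case out.
```python
import math

def ellipse_parameters(du_dx, du_dy, dv_dx, dv_dy):
    """Ellipse of the projected pixel: returns (minorRadius, majorRadius, theta)."""
    Ann = dv_dx ** 2 + dv_dy ** 2
    Bnn = -2.0 * (du_dx * dv_dx + du_dy * dv_dy)
    Cnn = du_dx ** 2 + du_dy ** 2
    F = Ann * Cnn - Bnn ** 2 / 4.0

    A, B, C = Ann / F, Bnn / F, Cnn / F
    root = math.sqrt((A - C) ** 2 + B ** 2)
    major = math.sqrt(1.0 / ((A + C - root) / 2.0))
    minor = math.sqrt(1.0 / ((A + C + root) / 2.0))

    # Radii are clamped so that neither falls below one texel.
    major = max(major, 1.0)
    minor = max(minor, 1.0)

    # theta = arctan(B / (A - C)) / 2; the A == C fallback is an assumption of this sketch.
    theta = 0.5 * math.atan(B / (A - C)) if A != C else math.pi / 4.0
    if A > C:
        # theta above is the minor-axis angle; rotate by pi/2 to get the major-axis angle.
        theta += math.pi / 2.0
    return minor, major, theta
```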
Step S104: determine the sampling layer L in the texture lookup table from the minor-axis radius of the ellipse.
In an embodiment of the invention, the layer is determined by level = log2(minorRadius). That is, if minorRadius is 1 the sampling is done on layer 0, if minorRadius is 2 it is done on layer 1, and so on. However, minorRadius is not always an exact power of two 2^n; when it is not, level is a floating-point number, and sampling is done on both layer ⌊level⌋ (layer L) and layer ⌈level⌉ (layer L+1), where ⌊·⌋ denotes rounding down and ⌈·⌉ denotes rounding up.
It should be understood that the above is merely an illustrative example and does not limit the scope of the invention.
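A sketch of this layer selection; the clamp of minorRadius to at least 1 mirrors the clamp introduced in step S103, and the returned fractional part anticipates the interpolation factor used in step S107.
```python
import math

def select_layers(minor_radius):
    """Layer selection from the minor radius: level = log2(minorRadius)."""
    level = math.log2(max(minor_radius, 1.0))
    L = math.floor(level)   # the L layer
    frac = level - L        # fractional part, reused as the interpolation factor in step S107
    return L, L + 1, frac   # when frac == 0, only layer L actually needs sampling
```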
Step S105: sample on layer L.
For determining the number of samples and the positions of the sampling points along the major axis, the invention proposes the following possible computation; those skilled in the art may of course derive other modifications or variations from it, and such modifications or variations should all fall within the scope of the invention.
First, the number of samples on layer L is determined from the minor-axis and major-axis radii of the ellipse:
Number = 2*(majorRadius/minorRadius) - 1.
Number is generally a floating-point number; it is rounded up to give the sample count, to prevent undersampling on layer L.
Next, the length of the sampling line, i.e. the distance from the first sampling point to the last, is computed:
lineLength = 2*(majorRadius - minorRadius).
The projected distance between two adjacent sampling points on the u axis is then Δu = cos(theta)*lineLength/(Number-1), and on the v axis Δv = sin(theta)*lineLength/(Number-1). If Number = 1, only the ellipse center is sampled on layer L; otherwise, with (u_mid, v_mid) denoting the coordinates of the ellipse center, the sampling points (u, v) are:
(u, v) = (u_mid, v_mid) + n/2*(Δu, Δv).
When Number is even, n takes the values ±1, ±3, ..., giving the sampling points shown in Fig. 7; when Number is odd, n takes the values 0, ±2, ±4, ..., giving the sampling points shown in Fig. 8.
In general the sampling-point coordinates are floating-point numbers; the color value at each sampling point is obtained by bilinear interpolation, and the average of these values is taken as the final color value of the layer-L sampling.
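A sketch of this sampling-point computation. axis_samples is a helper introduced here so it can be reused for layer L+1 later; the bilinear fetch and averaging of the colors at the returned coordinates are left to the caller.
```python
import math

def axis_samples(u_mid, v_mid, line_length, theta, number):
    """Place `number` sample points evenly on a segment of length `line_length`
    centred on (u_mid, v_mid) and inclined at `theta` to the u axis."""
    if number <= 1:
        return [(u_mid, v_mid)]
    du = math.cos(theta) * line_length / (number - 1)
    dv = math.sin(theta) * line_length / (number - 1)
    # n = +/-1, +/-3, ... for even `number`; n = 0, +/-2, +/-4, ... for odd `number`.
    if number % 2 == 0:
        ns = [s * k for k in range(1, number, 2) for s in (+1, -1)]
    else:
        ns = [0] + [s * k for k in range(2, number, 2) for s in (+1, -1)]
    return [(u_mid + n / 2.0 * du, v_mid + n / 2.0 * dv) for n in ns]

def layer_L_samples(u_mid, v_mid, minor, major, theta):
    """Layer-L sampling: Number = ceil(2*major/minor - 1) points over a line of
    length 2*(major - minor), as in step S105."""
    number = math.ceil(2.0 * major / minor - 1.0)
    return axis_samples(u_mid, v_mid, 2.0 * (major - minor), theta, number)
```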
Step S106: sample on layer L+1.
In an embodiment of the invention, because of the nature of MIP-MAP pre-filtering, the resolution of layer L+1 is half that of layer L, and each texel of layer L+1 is the average of four texels of layer L, so taking one sample on layer L+1 is equivalent to taking four samples on layer L. Therefore, to avoid oversampling layer L+1, in the embodiment of the invention the number of samples on layer L+1 is set to one quarter of that on layer L, i.e. Number' = ⌈Number/4⌉.
Otherwise the sampling pattern on layer L+1 is the same as on layer L, and the layer-L+1 sampling yields its own final color value.
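The quarter-count rule of this step, as a one-line helper:
```python
import math

def layer_L1_sample_count(number_L):
    """Number' = ceil(Number / 4): one sample on layer L+1 already covers four layer-L texels."""
    return math.ceil(number_L / 4.0)
```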
Step S107: linearly interpolate the color values obtained on layers L and L+1 to get the final color value.
Specifically, the samples taken on layer L and on layer L+1 are each averaged, giving one color value per layer, and these two values are linearly interpolated. As shown in Fig. 9, a schematic diagram of the sampling on layers L and L+1 of an embodiment of the invention, the final color value is Color = CL*(1-f) + CL1*f, where CL is the color value obtained on layer L, CL1 is the color value obtained on layer L+1, and f = level - ⌊level⌋ is the linear interpolation factor.
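The final blend as a one-line helper; frac is taken here to be the fractional part level - ⌊level⌋ returned by select_layers above, which is how the interpolation factor f is read in this sketch.
```python
def blend_layers(color_L, color_L1, frac):
    """Color = CL*(1 - f) + CL1*f, with f the fractional part of level (see select_layers)."""
    return color_L * (1.0 - frac) + color_L1 * frac
```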
In line with the above embodiments, the invention also proposes an anisotropy-based texture filtering apparatus. Fig. 10 shows its structure: the apparatus comprises a pre-filtering module 100, a projection module 200, and a computing module 300. The pre-filtering module 100 performs MIP-MAP pre-filtering on the texture image, representing it as an array of textures at different resolutions to serve as the texture lookup table. The projection module 200 projects a screen-space pixel into texture space, where it becomes an ellipse. The computing module 300 calculates the minor-axis radius, the major-axis radius, and the angle between the major axis and the u axis of the ellipse; determines the sampling layer L in the texture lookup table from the minor-axis radius; determines the number of samples on sampling layer L and the positions of the sampling points along the major axis from the minor-axis radius, the major-axis radius, and the major-axis angle; samples on sampling layer L to obtain its color value; then samples on sampling layer L+1 along the major axis according to the layer-L+1 sample count to obtain its color value; and linearly interpolates the color values obtained on sampling layers L and L+1 to obtain the final color value.
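Putting the pieces together, a possible composition of the helpers sketched above could look like the following. The bilinear lookup fetch_bilinear, its signature, and the convention that u and v are level-0 texel coordinates are all assumptions of this sketch rather than details fixed by the patent.
```python
def filter_pixel(mip_levels, derivs, u_mid, v_mid, fetch_bilinear):
    """End-to-end sketch: derivs = (du_dx, du_dy, dv_dx, dv_dy), and
    fetch_bilinear(mip_levels, layer, u, v) is a caller-supplied bilinear lookup."""
    minor, major, theta = ellipse_parameters(*derivs)
    L, L1, frac = select_layers(minor)

    pts_L = layer_L_samples(u_mid, v_mid, minor, major, theta)
    # Layer L+1: same pattern along the major axis, but only a quarter of the samples.
    pts_L1 = axis_samples(u_mid, v_mid, 2.0 * (major - minor), theta,
                          layer_L1_sample_count(len(pts_L)))

    def average(layer, pts):
        cols = [fetch_bilinear(mip_levels, layer, u, v) for u, v in pts]
        return sum(cols) / len(cols)

    return blend_layers(average(L, pts_L), average(L1, pts_L1), frac)
```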
Below, the texture filtering method of the embodiment of the invention is compared with the trilinear filtering method and with the Feline algorithm; through this comparison, the aspects and advantages of the invention become more apparent and easier to understand.
The existing trilinear filtering method is isotropic. Fig. 11-1 is a schematic diagram of its pixel projection: a screen-space pixel is regarded as a square of one pixel, and after projection into texture space the projected shape is approximated with a square, whose side length determines the sampling layer. Fig. 11-2 is a schematic diagram of the trilinear filtering sampling.
When the texture pattern of Fig. 11-3 is projected so that it recedes far from the viewport using the trilinear filtering method, the result is shown in Fig. 11-4, where the region inside the dark border is farthest from the viewport; Fig. 11-5 shows that region magnified three times.
The existing Feline algorithm is anisotropic. As in the filtering method of this embodiment, a screen-space pixel is regarded as a circle centered at the pixel with a radius of one pixel, and after projection into texture space it is regarded as an ellipse. The minor-axis radius, the major-axis radius, and the angle between the major axis and the u axis are computed, the sampling layer L and the sampling rate on layer L are determined, and the same number of samples is used on layer L+1 as on layer L. Fig. 12-1 is a schematic diagram of the Feline sampling: trilinear filtering is performed repeatedly along the major axis according to the sampling rate, and the resulting values are Gaussian-weighted to obtain the final color value.
However, the resolution of layer L+1 is half that of layer L, and each texel of layer L+1 is the average of four texels of layer L, so one sample on layer L+1 is equivalent to four samples on layer L; the existing Feline algorithm therefore oversamples layer L+1.
Projecting the texture pattern of Fig. 11-3 at the same viewing angle as with the trilinear filtering method gives the result shown in Fig. 12-2, where the region inside the dark border is farthest from the viewport; Fig. 12-3 shows that region magnified three times.
In view of this shortcoming of the existing Feline algorithm, the anisotropy-based texture filtering method of the embodiment of the invention improves the Feline sampling scheme: layer L is still sampled at the original rate, but the sampling rate on layer L+1 is set to one quarter of that of layer L, which avoids oversampling.
Projecting the texture pattern of Fig. 11-3 at the same viewing angle as the existing trilinear filtering method and the existing Feline algorithm gives the result shown in Fig. 13-1, where the region inside the dark border is farthest from the viewport; Fig. 13-2 shows that region magnified three times.
Comparing the results of Fig. 11-5, Fig. 12-3, and Fig. 13-2 shows that the anisotropic Feline algorithm performs better than the isotropic trilinear filtering method, and that the filtering method of the embodiment of the invention performs better still, because it remedies the Feline algorithm's oversampling of layer L+1 and therefore achieves a better result.
By using one quarter of the layer-L sample count as the layer-L+1 sample count, the invention avoids oversampling and keeps the texture sharp even where it recedes far from the viewpoint.
Although embodiments of the invention have been shown and described, those of ordinary skill in the art will appreciate that various changes, modifications, substitutions, and alterations can be made to these embodiments without departing from the principles and spirit of the invention, whose scope is defined by the appended claims and their equivalents.

Claims (14)

1. An anisotropy-based texture filtering method, characterized in that it comprises the following steps:
performing multum-in-parvo-map (MIP-MAP) pre-filtering on a texture image, representing the texture image as an array of textures at different resolutions to serve as a texture lookup table;
regarding a pixel in screen space as a circle with a radius of one pixel and projecting the pixel into texture space, where the projected shape is an ellipse;
calculating the minor-axis radius, the major-axis radius, and the angle between the major axis and the u axis of the ellipse;
determining a sampling layer L in the texture lookup table from the minor-axis radius of the ellipse;
determining the number of samples on sampling layer L and the positions of the sampling points along the major axis from the minor-axis radius, the major-axis radius, and the angle between the major axis and the u axis of the ellipse, sampling on sampling layer L, and obtaining its color value;
sampling on sampling layer L+1 along the major axis according to the number of samples of sampling layer L+1, and obtaining its color value;
linearly interpolating the color values obtained on sampling layer L and sampling layer L+1 to obtain a final color value.
2. The anisotropy-based texture filtering method of claim 1, characterized in that determining the sampling layer L in the texture lookup table from the minor-axis radius of the ellipse comprises:
if level = log2(minorRadius) is an integer, sampling on layer level, where minorRadius is the minor-axis radius;
if level = log2(minorRadius) is a floating-point number, sampling on layer ⌊level⌋ and on layer ⌈level⌉ respectively, where minorRadius is the minor-axis radius, ⌊·⌋ denotes rounding down, and ⌈·⌉ denotes rounding up.
3. The anisotropy-based texture filtering method of claim 1, characterized in that determining the number of samples on sampling layer L from the minor-axis radius, the major-axis radius, and the angle between the major axis and the u axis of the ellipse comprises:
the number of samples on sampling layer L is Number = 2*(majorRadius/minorRadius) - 1, rounded up if Number is a floating-point number, where majorRadius is the major-axis radius and minorRadius is the minor-axis radius.
4. The anisotropy-based texture filtering method of claim 1, characterized in that determining the positions of the sampling points from the minor-axis radius, the major-axis radius, and the angle between the major axis and the u axis of the ellipse comprises:
the sampling-point coordinates on sampling layer L are (u, v) = (u_mid, v_mid) + n/2*(Δu, Δv), where (u_mid, v_mid) are the coordinates of the ellipse center, (Δu, Δv) are the projected distances between two adjacent sampling points on the coordinate axes, n takes the values ±1, ±3, ... when the number of samples is even and 0, ±2, ±4, ... when it is odd, majorRadius is the major-axis radius, and minorRadius is the minor-axis radius.
5. The anisotropy-based texture filtering method of claim 4, characterized in that Δu = cos(theta)*lineLength/(Number-1) and Δv = sin(theta)*lineLength/(Number-1), where lineLength = 2*(majorRadius - minorRadius) is the length of the sampling line, i.e. the distance from the first sampling point to the last, and theta is the angle between the major axis and the u axis.
6. The anisotropy-based texture filtering method of claim 1, characterized in that the number of samples on sampling layer L+1 is one quarter of the number of samples on sampling layer L, i.e. Number' = ⌈Number/4⌉, where Number' is the number of samples on sampling layer L+1, Number is the number of samples on sampling layer L, and ⌈·⌉ denotes rounding up.
7. The anisotropy-based texture filtering method of claim 1, characterized in that the final color value is Color = CL*(1-f) + CL1*f, where CL is the color value obtained on sampling layer L, CL1 is the color value obtained on sampling layer L+1, and f = level - ⌊level⌋ is the linear interpolation factor.
8. An anisotropy-based texture filtering apparatus, characterized in that it comprises a pre-filtering module, a projection module, and a computing module, wherein:
the pre-filtering module performs MIP-MAP pre-filtering on a texture image, representing the texture image as an array of textures at different resolutions to serve as a texture lookup table;
the projection module regards a pixel in screen space as a circle with a radius of one pixel and projects the pixel into texture space, where the projected shape is an ellipse;
the computing module calculates the minor-axis radius, the major-axis radius, and the angle between the major axis and the u axis of the ellipse; determines a sampling layer L in the texture lookup table from the minor-axis radius of the ellipse; determines the number of samples on sampling layer L and the positions of the sampling points along the major axis from the minor-axis radius, the major-axis radius, and the angle between the major axis and the u axis; samples on sampling layer L to obtain its color value; then samples on sampling layer L+1 along the major axis according to the number of samples of sampling layer L+1 to obtain its color value; and linearly interpolates the color values obtained on sampling layer L and sampling layer L+1 to obtain a final color value.
9. The anisotropy-based texture filtering apparatus of claim 8, characterized in that the computing module determines the sampling layer L in the texture lookup table according to the following rule:
if level = log2(minorRadius) is an integer, sampling is done on layer level, where minorRadius is the minor-axis radius;
if level = log2(minorRadius) is a floating-point number, sampling is done on layer ⌊level⌋ and on layer ⌈level⌉ respectively, where minorRadius is the minor-axis radius, ⌊·⌋ denotes rounding down, and ⌈·⌉ denotes rounding up.
10. The anisotropy-based texture filtering apparatus of claim 8, characterized in that the computing module determines the number of samples on sampling layer L according to the following formula:
the number of samples on sampling layer L is Number = 2*(majorRadius/minorRadius) - 1, rounded up if Number is a floating-point number, where majorRadius is the major-axis radius and minorRadius is the minor-axis radius.
11. The anisotropy-based texture filtering apparatus of claim 8, characterized in that the computing module determines the positions of the sampling points according to the following formula:
the sampling-point coordinates on sampling layer L are (u, v) = (u_mid, v_mid) + n/2*(Δu, Δv), where (u_mid, v_mid) are the coordinates of the ellipse center, (Δu, Δv) are the projected distances between two adjacent sampling points on the coordinate axes, n takes the values ±1, ±3, ... when the number of samples is even and 0, ±2, ±4, ... when it is odd, majorRadius is the major-axis radius, and minorRadius is the minor-axis radius.
12. The anisotropy-based texture filtering apparatus of claim 11, characterized in that Δu = cos(theta)*lineLength/(Number-1) and Δv = sin(theta)*lineLength/(Number-1), where lineLength = 2*(majorRadius - minorRadius) is the length of the sampling line, i.e. the distance from the first sampling point to the last, and theta is the angle between the major axis and the u axis.
13. The anisotropy-based texture filtering apparatus of claim 8, characterized in that the number of samples on sampling layer L+1 is one quarter of the number of samples on sampling layer L, i.e. Number' = ⌈Number/4⌉, where Number' is the number of samples on sampling layer L+1, Number is the number of samples on sampling layer L, and ⌈·⌉ denotes rounding up.
14. The anisotropy-based texture filtering apparatus of claim 8, characterized in that the computing module calculates the final color value according to the following formula:
the final color value is Color = CL*(1-f) + CL1*f, where CL is the color value obtained on sampling layer L, CL1 is the color value obtained on sampling layer L+1, and f = level - ⌊level⌋ is the linear interpolation factor.
CN2009101781248A 2009-09-27 2009-09-27 Texture filtering method and device based on anisotropy Pending CN102034262A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN2009101781248A CN102034262A (en) 2009-09-27 2009-09-27 Texture filtering method and device based on anisotropy

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN2009101781248A CN102034262A (en) 2009-09-27 2009-09-27 Texture filtering method and device based on anisotropy

Publications (1)

Publication Number Publication Date
CN102034262A true CN102034262A (en) 2011-04-27

Family

ID=43887114

Family Applications (1)

Application Number Title Priority Date Filing Date
CN2009101781248A Pending CN102034262A (en) 2009-09-27 2009-09-27 Texture filtering method and device based on anisotropy

Country Status (1)

Country Link
CN (1) CN102034262A (en)

Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112686983A (en) * 2019-10-17 2021-04-20 畅想科技有限公司 Texture filtering
CN112686983B (en) * 2019-10-17 2024-06-14 畅想科技有限公司 Method and apparatus for performing texture filtering
GB2603560A (en) * 2021-01-06 2022-08-10 Advanced Risc Mach Ltd Graphics texture mapping
GB2610373A (en) * 2021-01-06 2023-03-08 Advanced Risc Mach Ltd Graphics texture mapping
US11610359B2 (en) 2021-01-06 2023-03-21 Arm Limited Graphics texture mapping
US11625887B2 (en) 2021-01-06 2023-04-11 Arm Limited Graphics texture mapping
US11645807B2 (en) 2021-01-06 2023-05-09 Arm Limited Graphics texture mapping
GB2603560B (en) * 2021-01-06 2023-08-16 Advanced Risc Mach Ltd Graphics texture mapping
GB2610373B (en) * 2021-01-06 2023-09-06 Advanced Risc Mach Ltd Graphics texture mapping
CN113065091A (en) * 2021-04-12 2021-07-02 中国地质科学院地质力学研究所 Method and device for analyzing anisotropic distribution rule of geological information and electronic equipment


Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C02 Deemed withdrawal of patent application after publication (patent law 2001)
WD01 Invention patent application deemed withdrawn after publication

Application publication date: 20110427