CN100489892C - Method and device for restoring image texture description sign for describing image texture characteristics - Google Patents
- Publication number
- CN100489892C · CNB008042845A · CN00804284A
- Authority
- CN
- China
- Prior art keywords
- indicator
- image
- scale
- regularity
- graph
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Expired - Fee Related
Links
Images
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T1/00—General purpose image data processing
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T11/00—2D [Two Dimensional] image generation
- G06T11/001—Texturing; Colouring; Generation of texture or colour
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/50—Information retrieval; Database structures therefor; File system structures therefor of still image data
- G06F16/58—Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually
- G06F16/583—Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually using metadata automatically derived from the content
- G06F16/5862—Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually using metadata automatically derived from the content using texture
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/23—Clustering techniques
- G06F18/232—Non-hierarchical techniques
- G06F18/2321—Non-hierarchical techniques using statistics or function optimisation, e.g. modelling of probability density functions
- G06F18/23213—Non-hierarchical techniques using statistics or function optimisation, e.g. modelling of probability density functions with fixed number of clusters, e.g. K-means clustering
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/40—Analysis of texture
- G06T7/41—Analysis of texture based on statistical description of texture
- G06T7/42—Analysis of texture based on statistical description of texture using transform domain methods
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/40—Extraction of image or video features
- G06V10/44—Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components
- G06V10/443—Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components by matching or filtering
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/70—Arrangements for image or video recognition or understanding using pattern recognition or machine learning
- G06V10/762—Arrangements for image or video recognition or understanding using pattern recognition or machine learning using clustering, e.g. of similar faces in social networks
- G06V10/763—Non-hierarchical techniques, e.g. based on statistics of modelling distributions
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V30/00—Character recognition; Recognising digital ink; Document-oriented image-based pattern recognition
- G06V30/10—Character recognition
- G06V30/19—Recognition using electronic means
- G06V30/196—Recognition using electronic means using sequential comparisons of the image signals with a plurality of references
- G06V30/1983—Syntactic or structural pattern recognition, e.g. symbolic string recognition
- G06V30/1988—Graph matching
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20068—Projection on vertical or horizontal image axis
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20072—Graph-based image processing
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Data Mining & Analysis (AREA)
- Probability & Statistics with Applications (AREA)
- Multimedia (AREA)
- Library & Information Science (AREA)
- Artificial Intelligence (AREA)
- Software Systems (AREA)
- General Engineering & Computer Science (AREA)
- Databases & Information Systems (AREA)
- Evolutionary Computation (AREA)
- Computing Systems (AREA)
- General Health & Medical Sciences (AREA)
- Life Sciences & Earth Sciences (AREA)
- Bioinformatics & Cheminformatics (AREA)
- Bioinformatics & Computational Biology (AREA)
- Evolutionary Biology (AREA)
- Medical Informatics (AREA)
- Health & Medical Sciences (AREA)
- Image Analysis (AREA)
- Image Processing (AREA)
- Information Retrieval, Db Structures And Fs Structures Therefor (AREA)
- Compression Or Coding Systems Of Tv Signals (AREA)
- Apparatus For Radiation Diagnosis (AREA)
- Editing Of Facsimile Originals (AREA)
Abstract
A method for retrieving an image texture descriptor for describing the texture features of an image, including the steps of (a) filtering input images using predetermined filters having different orientation coefficients, (b) projecting the filtered images onto axes of predetermined directions to obtain data groups consisting of the averages of the pixel values in each direction, (c) selecting candidate data groups from among the data groups by a predetermined classification method, (d) determining a plurality of indicators based on the orientation coefficients of the filters used in filtering the candidate data groups, and (e) determining the plurality of indicators as the texture descriptor of the image. Texture descriptors that allow the kinds of texture structure present in an image to be captured perceptually can thus be retrieved.
Description
Technical field
The present invention relates to a method and apparatus for describing image texture, and more particularly, to a method and apparatus for retrieving an image texture descriptor that describes the texture features of an image, the texture descriptor being used for searching and browsing images.
Background art
Recently, image texture has emerged as an important visual feature for searching and browsing large collections of similar image patterns. For example, a conventional texture descriptor is extracted by filtering an image with Gabor filters, the descriptor consisting of coefficients obtained from the Gabor filtering. However, although such a conventional image texture descriptor contains a large number of vector components, it is still very difficult to perceive the texture structure of an image visually from the descriptor.
Summary of the invention
An object of the present invention is to provide a method for retrieving an image texture descriptor that captures, in a perceptual manner, the texture structures present in an image.
Another object of the present invention is to provide a computer-readable recording medium storing a computer program arranged to make a computer execute the image texture descriptor retrieving method.
Still another object of the present invention is to provide an image texture descriptor retrieving apparatus that performs the image texture descriptor retrieving method.
To achieve the above objects, there is provided a method for retrieving an image texture descriptor describing the texture features of an image, comprising: (a) generating a regularity indicator indicating the regularity of the image; (b) generating a direction indicator indicating the direction of the image; (c) generating a scale indicator indicating the scale of the texture elements of the image; and (d) expressing the texture descriptor of the image using the regularity indicator, the direction indicator and the scale indicator.
To achieve the above objects, there is also provided an apparatus for retrieving an image texture descriptor describing the texture features of an image, comprising: a generating unit for generating a regularity indicator indicating the regularity of the image, a direction indicator indicating the direction of the image, and a scale indicator indicating the scale of the texture elements of the image; and an expressing unit for expressing the texture descriptor of the image using the regularity indicator, the direction indicator and the scale indicator.
To achieve the above objects, there is further provided a method for retrieving an image texture descriptor for describing the texture features of an image, comprising the steps of: (a) filtering an input image using predetermined filters having different orientation coefficients, (b) projecting the filtered images onto axes of predetermined directions to obtain data groups consisting of the averages of the pixel values in each direction, (c) selecting candidate data groups from among the data groups by a predetermined classification method, (d) determining a plurality of indicators based on the orientation coefficients of the filters used in filtering the candidate data groups, and (e) determining the plurality of indicators as the texture descriptor of the image.
Step (a) may further include the step of (a-1) filtering the input image using predetermined filters having different scale coefficients, and step (d) may further include the step of (d-1) determining a plurality of indicators based on the scale coefficients of the filters used in filtering the candidate data groups.
The image texture descriptor retrieving method may further include the step of determining another indicator based on the presence of data groups filtered by filters having scale coefficients or orientation coefficients close to or identical to the scale coefficients or orientation coefficients of the filters used in filtering the selected candidate data groups.
The image texture descriptor retrieving method may further include the step of calculating the mean and variance of the pixels of each filtered image, and obtaining a predetermined vector using the calculated means and variances.
According to another aspect of the present invention, there is provided a method for retrieving an image texture descriptor for describing the texture features of an image, comprising the steps of: (a) filtering an input image using predetermined filters having different scale coefficients, (b) projecting the filtered images onto axes of predetermined directions to obtain data groups consisting of the averages of the pixel values in each direction, (c) determining a plurality of indicators based on the scale coefficients of the filters used in filtering data groups selected from among the data groups by a predetermined selection method, and (d) determining the plurality of indicators as the texture descriptor of the image.
According to still another aspect of the present invention, there is provided a method for retrieving an image texture descriptor for describing the texture features of an image, comprising the steps of: (a) filtering an input image using predetermined filters having different orientation coefficients and different scale coefficients, (b) projecting the filtered images onto horizontal and vertical axes to obtain horizontal-axis projection graphs and vertical-axis projection graphs, (c) calculating the normalized autocorrelation value of each graph, (d) obtaining the local maxima and local minima of each normalized autocorrelation, at which the calculated normalized autocorrelation values form local peaks and local valleys at predetermined sections, (e) defining the average of the local maxima and the average of the local minima as contrast, (f) selecting, as first candidate graphs, the graphs in which the ratio of the standard deviation to the average of the local maxima is less than or equal to a predetermined threshold, (g) determining the type of each second candidate graph according to the number of graphs filtered by filters having scale or orientation coefficients close to or identical to those of the filters used in filtering the selected second candidate graphs, (h) counting the numbers of graphs belonging to the respective types of second candidate graphs and determining predetermined weights for the respective types, (i) calculating the sum of the products of the counted numbers and the determined weights, and determining the resulting value as a first indicator constituting the texture descriptor, (j) determining the orientation coefficients and scale coefficients of the second candidate graphs having the largest contrast as second through fifth indicators, and (k) determining indicators including the first indicator and the second through fifth indicators as the texture descriptor of the corresponding image.
The image texture descriptor retrieving method may further include the step of calculating the mean and variance of the pixels of each filtered image and obtaining a predetermined vector using the calculated means and variances, in which case step (k) comprises determining indicators including the first indicator, the second through fifth indicators and the predetermined vector as the texture descriptor of the corresponding image.
The normalized autocorrelation, denoted NAC(k), is preferably calculated by the following formula (1):

NAC(k) = [ Σ_{m=k}^{N-1} P(m−k)·P(m) ] / sqrt( Σ_{m=k}^{N-1} P(m−k)² · Σ_{m=k}^{N-1} P(m)² )      ...(1)

where N is a predetermined positive integer, the input image consists of N x N pixels, pixel positions are denoted by i (i being a number from 1 to N), P(i) denotes the projection graph formed by the pixels at pixel position i, and k is a number from 1 to N.
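For concreteness, formula (1) can be prototyped as the short Python sketch below; the function names and the use of NumPy are mine, not part of the patent.

```python
import numpy as np

def normalized_autocorrelation(P, k):
    """NAC(k) of a 1-D projection graph P of length N, per formula (1)."""
    P = np.asarray(P, dtype=float)
    N = len(P)
    if not 1 <= k < N:
        raise ValueError("k must lie in 1..N-1")
    num = np.sum(P[k:] * P[:N - k])                          # sum of P(m) * P(m-k)
    den = np.sqrt(np.sum(P[k:] ** 2) * np.sum(P[:N - k] ** 2))
    return num / den if den > 0 else 0.0

def nac_curve(P):
    """NAC values for all lags k = 1 .. N-1 of one projection graph."""
    return np.array([normalized_autocorrelation(P, k) for k in range(1, len(P))])
```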
Contrast is defined by the following formula (2):

contrast = mean(P_magn) − mean(V_magn)      ...(2)

that is, the average of the local maxima minus the average of the local minima, where P_magn(i) and V_magn(i) are the local maxima and local minima determined in step (d).
In step (f), graphs satisfying the following formula (3) are selected as first candidate graphs:

S / d ≤ α      ...(3)

where d and S are the mean and standard deviation of the local maxima and α is a predetermined threshold.
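A minimal sketch of formulas (2) and (3) follows, assuming SciPy's find_peaks is used to locate the local maxima and minima of the NAC curve; the threshold value alpha is only a placeholder, since the patent leaves it as a predetermined constant.

```python
import numpy as np
from scipy.signal import find_peaks

def contrast_and_first_candidate(nac, alpha=0.2):
    """Return (contrast, is_first_candidate) for the NAC curve of one graph.

    The local maxima and minima play the roles of P_magn(i) and V_magn(i);
    contrast is the difference of their means (formula (2)); the graph is a
    first candidate when S/d <= alpha (formula (3)), with d and S the mean
    and standard deviation of the local maxima.
    """
    nac = np.asarray(nac, dtype=float)
    p_idx, _ = find_peaks(nac)          # local maxima
    v_idx, _ = find_peaks(-nac)         # local minima
    if len(p_idx) == 0 or len(v_idx) == 0:
        return 0.0, False
    p_magn, v_magn = nac[p_idx], nac[v_idx]
    contrast = p_magn.mean() - v_magn.mean()
    d, s = p_magn.mean(), p_magn.std()
    return contrast, d > 0 and (s / d) <= alpha
```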
Step (g) preferably includes the substeps of: (g-1) classifying a relevant candidate graph as a first-type graph if there are one or more graphs having scale or orientation coefficients identical to those of the relevant candidate graph and one or more graphs having scale or orientation coefficients close to those of the relevant candidate graph; (g-2) classifying the relevant candidate graph as a second-type graph if there are one or more graphs having scale or orientation coefficients identical to those of the relevant candidate graph but no graphs having scale or orientation coefficients close thereto; and (g-3) classifying the relevant candidate graph as a third-type graph if there are no graphs having scale or orientation coefficients identical or close to those of the relevant candidate graph.
Step (h) preferably includes counting the numbers of graphs belonging to each of the first through third types and determining a predetermined weight for each graph type.
After step (f), the method may further include the step of applying a predetermined algorithm to the first candidate graphs to select second candidate graphs.
The predetermined algorithm is preferably a modified agglomerative clustering algorithm.
Preferably, in step (j), the orientation coefficient of the graph having the largest contrast among the horizontal-axis projection graphs is determined as the second indicator; the orientation coefficient of the graph having the largest contrast among the vertical-axis projection graphs is determined as the third indicator; the scale coefficient of the graph having the largest contrast among the horizontal-axis projection graphs is determined as the fourth indicator; and the scale coefficient of the graph having the largest contrast among the vertical-axis projection graphs is determined as the fifth indicator.
Step (j) may include the step of determining indicators including the first indicator, the second through fifth indicators and the predetermined vector as the texture descriptor of the corresponding image.
The predetermined filters preferably include Gabor filters.
To achieve the second object of the present invention, there is provided a computer-readable medium having program code executable by a computer to perform a method for describing the texture features of an image, the method comprising the steps of: (a) filtering an input image using predetermined filters having different orientation coefficients and different scale coefficients, (b) projecting the filtered images onto horizontal and vertical axes to obtain horizontal-axis projection graphs and vertical-axis projection graphs, (c) calculating the normalized autocorrelation value of each graph, (d) obtaining the local maxima and local minima of each normalized autocorrelation, at which the calculated normalized autocorrelation values form local peaks and local valleys at predetermined sections, (e) defining the average of the local maxima and the average of the local minima as contrast, (f) selecting, as first candidate graphs, the graphs in which the ratio of the standard deviation to the average of the local maxima is less than or equal to a predetermined threshold, (g) determining the type of each second candidate graph according to the number of graphs filtered by filters having scale or orientation coefficients close to or identical to those of the filters used in filtering the selected second candidate graphs, (h) counting the numbers of graphs belonging to the respective types of second candidate graphs and determining predetermined weights for the respective types, (i) calculating the sum of the products of the counted numbers and the determined weights, and determining the resulting value as a first indicator constituting the texture descriptor, (j) determining the orientation coefficients and scale coefficients of the second candidate graphs having the largest contrast as second through fifth indicators, and (k) determining indicators including the first indicator and the second through fifth indicators as the texture descriptor of the corresponding image.
To achieve the third object of the present invention, there is provided an apparatus for retrieving an image texture descriptor describing the texture features of an image, the apparatus comprising: filtering means for filtering an input image using predetermined filters having different orientation coefficients; projecting means for projecting the filtered images onto axes of predetermined directions to obtain data groups consisting of the averages of the pixel values in each direction; classifying means for selecting candidate data groups from among the data groups by a predetermined classification method; first indicator determining means for determining another indicator based on the number of graphs filtered by filters having scale or orientation coefficients close to or identical to those of the filters used in filtering the selected candidate graphs; and second indicator determining means for determining a plurality of indicators based on the scale coefficients and orientation coefficients of the filters used in filtering the determined candidate graphs.
Alternatively, there is provided an apparatus for retrieving an image texture descriptor describing the texture features of an image, the apparatus comprising: a filtering unit for filtering an input image using predetermined filters having different orientation coefficients and different scale coefficients; an image mean/variance calculating unit for calculating the mean and variance of the pixels of each filtered image and obtaining a predetermined vector using the calculated means and variances; a projecting unit for projecting the filtered images onto horizontal and vertical axes to obtain horizontal-axis projection graphs and vertical-axis projection graphs; a calculating unit for calculating the normalized autocorrelation value of each graph; a peak detecting/analyzing unit for detecting, for each normalized autocorrelation, the local maxima and local minima at which the calculated values form local peaks and local valleys at predetermined sections; a mean/variance calculating unit for calculating the mean of the local maxima and the mean of the local minima; a first candidate graph selecting/storing unit for selecting, as first candidate graphs, the graphs satisfying the requirement that the ratio of the standard deviation to the mean of the local maxima is less than or equal to a predetermined threshold; a second candidate graph selecting/storing unit for applying a predetermined algorithm to the first candidate graphs to select second candidate graphs; a classifying unit for counting the numbers of graphs belonging to the respective types of second candidate graphs, outputting data signals indicating the number of graphs of each type, determining predetermined weights for the graphs of the respective types, and outputting data signals indicating the weight applied to each type; a first indicator determining unit for calculating the sum of the products of the data representing the numbers of graphs belonging to the respective types and the data representing the weights applied to the respective types, and determining and outputting the result as a first indicator constituting the texture descriptor; a contrast calculating unit for calculating the contrast according to formula (2) using the means output from the mean/variance calculating unit and outputting a signal indicating which calculated contrast is largest; the second candidate graph selecting/storing unit outputting, in response to that signal, the candidate graph having the largest contrast among the second candidate graphs stored therein; a second-through-fifth indicator determining unit for determining the orientation coefficient of the graph having the largest contrast among the horizontal-axis projection graphs as a second indicator, the orientation coefficient of the graph having the largest contrast among the vertical-axis projection graphs as a third indicator, the scale coefficient of the graph having the largest contrast among the horizontal-axis projection graphs as a fourth indicator, and the scale coefficient of the graph having the largest contrast among the vertical-axis projection graphs as a fifth indicator; and a texture descriptor output unit for combining the first indicator, the second through fifth indicators and the predetermined vector and outputting the combined result as the texture descriptor of the corresponding image.
Description of drawings
The above objects and advantages of the present invention will become more apparent from the following detailed description of embodiments thereof with reference to the accompanying drawings, in which:
FIGS. 1A and 1B are flowcharts showing an image texture descriptor retrieving method according to the present invention;
FIG. 2 is a block diagram of an image texture descriptor retrieving apparatus according to the present invention; and
FIG. 3 shows perceptual browsing components (PBCs) extracted from Brodatz texture images based on the image texture descriptor retrieving method of the present invention.
Embodiment
Hereinafter, embodiments of the present invention will be described in detail with reference to the accompanying drawings.
Referring to FIG. 1A, which shows an image texture descriptor retrieving method according to the present invention, an input image consisting of N x N pixels, for example 128 x 128 pixels, with N a predetermined positive integer, is filtered using Gabor filters (step 100). The Gabor filters are composed of filters having different orientation coefficients and different scale coefficients. Assuming C1 and C2 are predetermined positive integers, the input image is filtered by filters having C1 kinds of orientation coefficients and C2 kinds of scale coefficients, and the filters output C1 x C2 kinds of filtered images.
Next, the mean and variance of the pixels are calculated for each of the C1 x C2 filtered images, and a vector Z is obtained using these means and variances (step 102).
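Steps 100 and 102 might be prototyped as follows, assuming scikit-image's gabor_kernel for the filter bank; the particular orientation/scale grid, frequency ladder and helper names (gabor_filter_bank, filter_and_statistics) are illustrative choices, not values prescribed by the patent.

```python
import numpy as np
from scipy import ndimage
from skimage.filters import gabor_kernel

def gabor_filter_bank(c1=6, c2=4):
    """Build C1 orientations x C2 scales of Gabor kernels (real part only).
    The frequency ladder below is an illustrative choice."""
    kernels = []
    for s in range(c2):
        frequency = 0.25 / (2 ** s)
        for o in range(c1):
            theta = o * np.pi / c1
            kernels.append((s, o, np.real(gabor_kernel(frequency, theta=theta))))
    return kernels

def filter_and_statistics(image, kernels):
    """Steps 100-102: filter the N x N image and collect the per-image
    (mean, variance) pairs that make up the vector Z."""
    image = np.asarray(image, dtype=float)
    filtered, z = [], []
    for s, o, k in kernels:
        out = ndimage.convolve(image, k, mode="wrap")
        filtered.append((s, o, out))
        z.extend([out.mean(), out.var()])
    return filtered, np.array(z)
```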
Then, the filtered images are projected onto the x and y axes to obtain x-axis projection graphs and y-axis projection graphs (step 104). The normalized autocorrelation (NAC) value of each projection graph P(i) (i being a number from 1 to N), denoted NAC(k), is calculated by the following formula (1):

NAC(k) = [ Σ_{m=k}^{N-1} P(m−k)·P(m) ] / sqrt( Σ_{m=k}^{N-1} P(m−k)² · Σ_{m=k}^{N-1} P(m)² )      ...(1)

where pixel positions are denoted by i, P(i) denotes the projection graph formed by the pixels at pixel position i, and k is a number from 1 to N (N being a positive integer).
Next, the local maxima P_magn(i) and local minima V_magn(i), at which the calculated NAC(k) forms local peaks and local valleys at predetermined sections, are obtained (step 108).
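A sketch of the projection of step 104 and the peak/valley detection of step 108 is given below; averaging along each axis and the use of SciPy's find_peaks are my assumptions about one reasonable realization, and nac_curve refers to the earlier sketch of formula (1).

```python
import numpy as np
from scipy.signal import find_peaks

def project(filtered_image):
    """Step 104: average the pixel values along each axis to obtain the
    x-axis and y-axis projection graphs."""
    img = np.asarray(filtered_image, dtype=float)
    p_x = img.mean(axis=0)   # projection onto the horizontal (x) axis
    p_y = img.mean(axis=1)   # projection onto the vertical (y) axis
    return p_x, p_y

def peaks_and_valleys(nac):
    """Step 108: local maxima P_magn(i) and local minima V_magn(i) of the
    NAC curve of one projection graph."""
    nac = np.asarray(nac, dtype=float)
    p_idx, _ = find_peaks(nac)
    v_idx, _ = find_peaks(-nac)
    return nac[p_idx], nac[v_idx]
```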
Contrast is defined by the following formula (2):

contrast = mean(P_magn) − mean(V_magn)      ...(2)

that is, the average of the local maxima minus the average of the local minima (step 110).
In addition, graphs satisfying the following formula (3) are selected as first candidate graphs (step 112):

S / d ≤ α      ...(3)

where d and S are the mean and standard deviation of the local maxima P_magn(i) and α is a predetermined threshold.
Referring to FIG. 1B, modified agglomerative clustering is applied to the first candidate graphs to select second candidate graphs (step 114). The modified agglomerative clustering algorithm is an appropriately revised version of the agglomerative clustering algorithm disclosed by R. O. Duda and P. E. Hart in "Pattern Classification and Scene Analysis", John Wiley and Sons, New York, 1973, and is briefly described below. First, for N graphs P_1, ..., P_N, let d_i and S_i be the mean and standard deviation of the distances between the peaks of each graph, so that each graph has a corresponding two-dimensional vector (d_i, S_i). The P_i are then grouped using the vectors (d_i, S_i) as follows. With the initial number of clusters equal to N and the desired number of clusters denoted M_c, each cluster C_i is initially expressed as C_1 = {P_1}, C_2 = {P_2}, ..., C_N = {P_N}. If the number of clusters is smaller than M_c, grouping stops. Next, the two nearest clusters C_i and C_j are obtained. If the distance between C_i and C_j is greater than a predetermined threshold, grouping stops. Otherwise, C_i and C_j are merged so that one of the two clusters is removed. This process is repeated until the number of clusters reaches the predetermined number. Then, among the clusters obtained by the grouping, the cluster having the most graphs is selected, and the graphs in the selected cluster are chosen as the candidate graphs.
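The merge loop of this modified agglomerative clustering could be prototyped as below; the distance threshold, the target cluster count M_c and the exact feature construction are assumptions, since the text only fixes the overall procedure.

```python
import numpy as np

def modified_agglomerative_selection(features, m_c=3, dist_threshold=None):
    """Group graphs by their (d_i, S_i) vectors and return the indices of the
    largest cluster, which supplies the second candidate graphs.

    features : array of shape (n, 2); row i holds (d_i, S_i), the mean and
               standard deviation of the distances between the peaks of the
               i-th first candidate graph.
    """
    feats = np.asarray(features, dtype=float)
    clusters = [[i] for i in range(len(feats))]        # start: one graph per cluster
    if not clusters:
        return []

    def centroid(c):
        return feats[c].mean(axis=0)

    while len(clusters) > m_c:
        best = None                                     # find the two nearest clusters
        for a in range(len(clusters)):
            for b in range(a + 1, len(clusters)):
                dist = np.linalg.norm(centroid(clusters[a]) - centroid(clusters[b]))
                if best is None or dist < best[0]:
                    best = (dist, a, b)
        dist, a, b = best
        if dist_threshold is not None and dist > dist_threshold:
            break                                       # nearest pair still too far apart
        clusters[a] = clusters[a] + clusters[b]         # merge, removing cluster b
        del clusters[b]

    return max(clusters, key=len)                       # indices of the largest cluster
```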
Now, the second candidate graphs are classified into three types (step 116). The classification is carried out according to the number of graphs filtered by filters having scale or orientation coefficients close to or identical to those of the filter used in filtering each second candidate graph. In the following, for convenience of explanation, graphs filtered by filters having a certain scale coefficient or a certain orientation coefficient are referred to as certain-scale-coefficient graphs or certain-orientation-coefficient graphs.
More specifically, first, if there are one or more graphs having scale or orientation coefficients identical to those of a relevant candidate graph and one or more graphs having scale or orientation coefficients close to those of the relevant candidate graph, the relevant candidate graph is classified as a type-C1 graph; second, if there are one or more graphs having scale or orientation coefficients identical to those of the relevant candidate graph but no graphs having scale or orientation coefficients close thereto, the relevant candidate graph is classified as a type-C2 graph; and finally, if there are no graphs having scale or orientation coefficients identical or close to those of the relevant candidate graph, the relevant candidate graph is classified as a type-C3 graph. The numbers of graphs belonging to the types C1, C2 and C3 are then counted and denoted N1, N2 and N3, respectively, and weights W1, W2 and W3 are determined for the graphs of the respective types, as described below.
Now, using the determined numbers N1, N2 and N3 and the weights W1, W2 and W3, the following calculation of formula (4) is carried out:

M = N1·W1 + N2·W2 + N3·W3      ...(4)

and the resulting value M is determined as the first indicator V1 constituting the texture descriptor (step 118).
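Steps 116 and 118 might look roughly like the following sketch. How "identical" and "close" coefficients are tested, which graphs each candidate is compared against, and the weight values W1 through W3 are all left open by the text, so the choices below are purely illustrative.

```python
def classify_and_first_indicator(candidates, weights=(0.5, 0.3, 0.2)):
    """Steps 116-118: classify second candidate graphs into types C1/C2/C3
    and compute V1 = N1*W1 + N2*W2 + N3*W3 (formula (4)).

    candidates : list of dicts such as {"scale": 2, "orientation": 3}, one per
                 second candidate graph (scale/orientation are filter indices).
    weights    : (W1, W2, W3); placeholders for the patent's predefined weights.
    """
    def same_or_close(a, b):
        same = a["scale"] == b["scale"] or a["orientation"] == b["orientation"]
        # "close" is taken here as adjacent filter indices -- an illustrative
        # reading, since the text does not fix the closeness measure
        close = (abs(a["scale"] - b["scale"]) == 1
                 or abs(a["orientation"] - b["orientation"]) == 1)
        return same, close

    n1 = n2 = n3 = 0
    for i, cand in enumerate(candidates):
        others = [c for j, c in enumerate(candidates) if j != i]
        has_same = any(same_or_close(cand, o)[0] for o in others)
        has_close = any(same_or_close(cand, o)[1] for o in others)
        if has_same and has_close:
            n1 += 1          # type C1
        elif has_same:
            n2 += 1          # type C2
        else:
            n3 += 1          # type C3

    w1, w2, w3 = weights
    return n1 * w1 + n2 * w2 + n3 * w3   # M of formula (4), used as V1
```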
For the second candidate graphs, the orientation coefficients and scale coefficients of the graphs having the largest contrast are determined as the second through fifth indicators (step 120). More specifically, the orientation coefficient of the graph having the largest contrast among the x-axis projection graphs is determined as the second indicator V2; the orientation coefficient of the graph having the largest contrast among the y-axis projection graphs is determined as the third indicator V3; the scale coefficient of the graph having the largest contrast among the x-axis projection graphs is determined as the fourth indicator V4; and the scale coefficient of the graph having the largest contrast among the y-axis projection graphs is determined as the fifth indicator V5.
Using the first indicator V1 determined in step 118, the second through fifth indicators V2, V3, V4 and V5, and the vector Z determined in step 102, the texture descriptor of the image is set; that is, the texture feature vector is {[V1, V2, V3, V4, V5], Z} (step 122).
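Steps 120 and 122 reduce to picking the maximum-contrast graph on each axis and concatenating the indicators with the vector Z; the dictionary-based data structure in the sketch below is an assumption for illustration only.

```python
def second_to_fifth_indicators(x_graphs, y_graphs):
    """Step 120: among the second candidate graphs of each axis, take the one
    with the largest contrast; V2/V3 are its orientation coefficients and
    V4/V5 its scale coefficients.

    x_graphs, y_graphs : non-empty lists of dicts such as
        {"scale": 1, "orientation": 2, "contrast": 0.8}
    """
    best_x = max(x_graphs, key=lambda g: g["contrast"])
    best_y = max(y_graphs, key=lambda g: g["contrast"])
    return (best_x["orientation"], best_y["orientation"],
            best_x["scale"], best_y["scale"])

def texture_descriptor(v1, v2, v3, v4, v5, z):
    """Step 122: assemble the texture feature vector {[V1..V5], Z}."""
    return [v1, v2, v3, v4, v5], z
```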
The first indicator V1 indicates the degree of structuredness of the image texture. It has been experimentally confirmed that the first indicator represents the structure of the image texture well. The second and third indicators V2 and V3 represent the two quantized orientations in which the structuredness is captured most strongly. The fourth and fifth indicators V4 and V5 represent the two quantized scales in which the structuredness is captured most strongly.

The texture descriptor is used as an index of an image in browsing or search-and-retrieval applications. In particular, the image texture descriptor retrieved by the image texture descriptor retrieving method of the present invention is well suited to checker-board patterns, images whose browsing pattern is regular, structure-oriented browsing, and even embroidery patterns. Thus, applying the image texture descriptor retrieving method of the present invention to structure-oriented browsing applications, that is, to searches among structurally similar patterns, allows image searches that are better adapted to visual perception. Accordingly, among the indicators constituting the texture descriptor retrieved by the image texture descriptor retrieving method of the present invention, the first through fifth indicators V1, V2, V3, V4 and V5 can be called perceptual browsing components (PBCs).
In addition, for each filtered image, the mean and variance of the pixel values are calculated. The vector Z obtained using these means and variances is called the similarity retrieval component (SRC).
In other words, in the image texture descriptor retrieving method of the present invention, the texture descriptor allows the kinds of texture structure present in an image to be captured perceptually.
The embodiment described above uses, as the texture descriptor of an image, the first indicator V1, which is a good indicator of the structuredness of the image texture, the second and third indicators V2 and V3, which represent the two quantized orientations in which the structuredness is captured most strongly, and the fourth and fifth indicators V4 and V5, which represent the two quantized scales in which the structuredness is captured most strongly. However, the embodiment is used in a descriptive sense only and not for purposes of limitation. An arbitrarily selected plurality of indicators, or a single indicator best suited to the characteristics of an image, may also be used as the texture descriptor of the image. Therefore, the embodiment described above does not limit the scope of the present invention.
In addition, the image texture descriptor retrieving method can be written as a computer program. Those skilled in the art can easily derive the codes and code segments constituting the program. The program is stored in a computer-readable medium and is read and executed by a computer, thereby embodying the image texture descriptor retrieving method. The medium includes magnetic recording media, optical recording media, carrier waves, and the like.
The image texture descriptor retrieving method can also be embodied by an image texture descriptor retrieving apparatus. FIG. 2 is a block diagram of an image texture descriptor retrieving apparatus according to the present invention. Referring to FIG. 2, the image texture descriptor retrieving apparatus includes a Gabor filter 200, an image mean/variance calculating unit 202, an x-axis projector 204, a y-axis projector 205, an NAC calculating unit 206 and a peak detecting/analyzing unit 208. The apparatus further includes a mean/variance calculating unit 210, a first candidate graph selecting/storing unit 212, a second candidate graph selecting/storing unit 214, a classifying unit 216, a first indicator determining unit 218, a contrast calculating unit 220, a second-through-fifth indicator determining unit 222 and a texture descriptor output unit 224.
In operation of the image texture descriptor retrieving apparatus, with N a predetermined positive integer, the Gabor filter 200 filters an input image consisting of N x N pixels, for example 128 x 128 pixels, using filters (not shown) having different orientation coefficients and different scale coefficients, and outputs the filtered images (image_filtered). Assuming C1 and C2 are predetermined positive integers, the input image is filtered by filters having C1 kinds of orientation coefficients and C2 kinds of scale coefficients, and the filters output C1 x C2 kinds of filtered images.
The image mean/variance calculating unit 202 calculates the mean and variance of the pixels of each of the C1 x C2 filtered images, obtains a vector Z using these means and variances, and outputs the obtained vector Z.
The x-axis projector 204 and the y-axis projector 205 project the filtered images onto the x and y axes to obtain x-axis projection graphs and y-axis projection graphs. In other words, with pixel positions denoted by i (i being a number from 1 to N), the x-axis projector 204 and the y-axis projector 205 output the projection graphs P(i) formed by the pixels at the pixel positions i (i = 1, ..., N). The NAC calculating unit 206 then calculates the normalized autocorrelation value NAC(k) of each projection graph according to formula (1).
The peak detecting/analyzing unit 208 detects the local maxima P_magn(i) and local minima V_magn(i) at which the calculated NAC(k) forms local peaks and local valleys at predetermined sections.
The mean/variance calculating unit 210 calculates the mean d and standard deviation S of the local maxima P_magn(i) and outputs them. The first candidate graph selecting/storing unit 212 receives the mean d and standard deviation S, selects as first candidate graphs (1st_CAND) the graphs satisfying formula (3), where α is a predetermined threshold, and stores the selected first candidate graphs.
The second candidate graph selecting/storing unit 214 applies modified agglomerative clustering to the first candidate graphs to select second candidate graphs (2nd_CAND).
The classifying unit 216 classifies the second candidate graphs into their respective types, counts the numbers N1, N2 and N3 of graphs belonging to each type, and determines the weights W1, W2 and W3 applied to the respective types. The first indicator determining unit 218 uses the determined numbers N1, N2 and N3 and the weights W1, W2 and W3 to compute M according to formula (4), and determines and outputs the result as the first indicator V1 constituting the texture descriptor.
The contrast calculating unit 220 calculates the contrast according to formula (2) and outputs a signal Cont_max indicating which calculated contrast is largest.
In response to this signal, the second candidate graph selecting/storing unit 214 outputs, to the second-through-fifth indicator determining unit 222, the candidate graph having the largest contrast among the second candidate graphs stored therein.
The second-through-fifth indicator determining unit 222 determines the orientation coefficients and scale coefficients of the graphs having the largest contrast as the second through fifth indicators. In other words, the orientation coefficient of the graph having the largest contrast among the x-axis projection graphs is determined as the second indicator V2; the orientation coefficient of the graph having the largest contrast among the y-axis projection graphs is determined as the third indicator V3; the scale coefficient of the graph having the largest contrast among the x-axis projection graphs is determined as the fourth indicator V4; and the scale coefficient of the graph having the largest contrast among the y-axis projection graphs is determined as the fifth indicator V5.
The texture descriptor output unit 224 uses the first indicator V1 output from the first indicator determining unit 218, the second through fifth indicators V2, V3, V4 and V5 output from the second-through-fifth indicator determining unit 222, and the vector Z output from the image mean/variance calculating unit 202, to set and output the texture descriptor, that is, the texture feature vector {[V1, V2, V3, V4, V5], Z}.
FIG. 3 shows perceptual browsing components (PBCs) extracted by simulation from Brodatz texture images based on the image texture descriptor retrieving method of the present invention.
As described above, according to the image texture descriptor retrieving method of the present invention, texture descriptors that allow the kinds of texture structure present in an image to be captured perceptually can be retrieved.
Industrial applicability
The present invention can be applied to the fields of image browsing and search-and-retrieval applications.
Claims (26)
1. A method for describing the texture features of an image, comprising:
(a) filtering an input image using predetermined filters having different orientation coefficients and different scale coefficients to produce a plurality of filtered images;
(b) projecting the filtered images onto horizontal and vertical axes to obtain horizontal-axis projection graphs and vertical-axis projection graphs;
(c) generating, based on said horizontal-axis projection graphs and said vertical-axis projection graphs, a regularity indicator indicating the regularity of said image, a direction indicator indicating the direction of said image, and a scale indicator indicating the scale of the texture elements of said image; and
(d) expressing the texture descriptor of said image using said regularity indicator, said direction indicator and said scale indicator.
2. The method of claim 1, wherein step (c) comprises generating a regularity indicator that represents the regularity of the image as one of a plurality of predetermined values.
3. The method of claim 1, wherein step (c) comprises generating a regularity indicator that is a quantized integer.
4. The method of claim 1, wherein step (c) comprises generating a direction indicator that represents the direction as one of a plurality of predetermined values.
5. The method of claim 1, wherein step (c) comprises generating a direction indicator that is a quantized integer.
6. The method of claim 1, wherein step (c) comprises generating a scale indicator that represents the scale as one of a plurality of predetermined values.
7. The method of claim 1, wherein step (c) comprises generating a scale indicator that is a quantized integer.
8. The method of claim 1, wherein step (d) comprises expressing the texture descriptor of the image as a vector (regularity indicator, direction indicator, scale indicator).
9. The method of claim 1, wherein step (c) comprises generating a direction indicator characterized by a dominant direction of the image.
10. The method of claim 9, wherein step (c) comprises generating a scale indicator characterized by the scale corresponding to the dominant direction of the image.
11. The method of claim 9, wherein step (c) comprises generating a first direction indicator and a second direction indicator characterized by a first dominant direction of the image and a second dominant direction of the image, respectively.
12. The method of claim 11, wherein step (c) comprises generating a first scale indicator characterized by the scale corresponding to the first dominant direction of the image and a second scale indicator characterized by the scale corresponding to the second dominant direction of the image.
13. The method of claim 12, wherein step (d) comprises expressing the texture descriptor of the image as a vector (regularity indicator, first direction indicator, second direction indicator, first scale indicator, second scale indicator).
14. An apparatus for describing the texture features of an image, comprising:
a filtering unit for filtering an input image using predetermined filters having different orientation coefficients and different scale coefficients to produce a plurality of filtered images;
a projecting unit for projecting the filtered images onto horizontal and vertical axes to obtain horizontal-axis projection graphs and vertical-axis projection graphs;
a generating unit for generating, based on said horizontal-axis projection graphs and said vertical-axis projection graphs, a regularity indicator indicating the regularity of said image, a direction indicator indicating the direction of said image, and a scale indicator indicating the scale of the texture elements of said image; and
an expressing unit for expressing the texture descriptor of said image using said regularity indicator, said direction indicator and said scale indicator.
15. The apparatus of claim 14, wherein the generating unit generates a regularity indicator that represents the regularity of the image as one of a plurality of predetermined values.
16. The apparatus of claim 14, wherein the generating unit generates a regularity indicator that is a quantized integer.
17. The apparatus of claim 14, wherein the generating unit generates a direction indicator that represents the direction as one of a plurality of predetermined values.
18. The apparatus of claim 14, wherein the generating unit generates a direction indicator that is a quantized integer.
19. The apparatus of claim 14, wherein the generating unit generates a scale indicator that represents the scale as one of a plurality of predetermined values.
20. The apparatus of claim 14, wherein the generating unit generates a scale indicator that is a quantized integer.
21. The apparatus of claim 14, wherein the expressing unit expresses the texture descriptor of the image as a vector (regularity indicator, direction indicator, scale indicator).
22. The apparatus of claim 14, wherein the generating unit generates a direction indicator characterized by a dominant direction of the image.
23. The apparatus of claim 14, wherein the generating unit generates a scale indicator characterized by the scale corresponding to the dominant direction of the image.
24. The apparatus of claim 14, wherein the generating unit generates a first direction indicator and a second direction indicator characterized by a first dominant direction of the image and a second dominant direction of the image, respectively.
25. The apparatus of claim 24, wherein the generating unit generates a first scale indicator characterized by the scale corresponding to the first dominant direction of the image and a second scale indicator characterized by the scale corresponding to the second dominant direction of the image.
26. The apparatus of claim 25, wherein the generating unit generates a texture descriptor of the image that is a vector (regularity indicator, first direction indicator, second direction indicator, first scale indicator, second scale indicator).
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US11874099P | 1999-02-05 | 1999-02-05 | |
US60/118,740 | 1999-02-05 |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CNB200310102673XA Division CN100405399C (en) | 1999-02-05 | 2000-02-03 | Image texture retrieving method and apparatus thereof |
Publications (2)
Publication Number | Publication Date |
---|---|
CN1341248A CN1341248A (en) | 2002-03-20 |
CN100489892C true CN100489892C (en) | 2009-05-20 |
Family
ID=22380454
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CNB008042845A Expired - Fee Related CN100489892C (en) | 1999-02-05 | 2000-02-03 | Method and device for restoring image texture description sign for describing image texture characteristics |
CNB200310102673XA Expired - Fee Related CN100405399C (en) | 1999-02-05 | 2000-02-03 | Image texture retrieving method and apparatus thereof |
Family Applications After (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CNB200310102673XA Expired - Fee Related CN100405399C (en) | 1999-02-05 | 2000-02-03 | Image texture retrieving method and apparatus thereof |
Country Status (16)
Country | Link |
---|---|
US (3) | US6624821B1 (en) |
EP (3) | EP1153365B1 (en) |
JP (2) | JP2002536750A (en) |
KR (5) | KR100444778B1 (en) |
CN (2) | CN100489892C (en) |
AT (3) | ATE490518T1 (en) |
AU (1) | AU775858B2 (en) |
BR (1) | BR0007956A (en) |
CA (2) | CA2361492A1 (en) |
DE (3) | DE60036082T2 (en) |
MX (1) | MXPA01007845A (en) |
MY (2) | MY128897A (en) |
NZ (1) | NZ513144A (en) |
SG (1) | SG142124A1 (en) |
TW (1) | TW528990B (en) |
WO (1) | WO2000046750A1 (en) |
Families Citing this family (109)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
AU2003252765B2 (en) * | 1999-02-05 | 2006-06-29 | Samsung Electronics Co., Ltd. | Image texture retrieving method and apparatus thereof |
KR100308456B1 (en) | 1999-07-09 | 2001-11-02 | 오길록 | Texture description method and texture retrieval method in frequency space |
KR100355404B1 (en) * | 1999-12-03 | 2002-10-11 | 삼성전자 주식회사 | Texture description method and texture retrieval method using Gabor filter in frequency domain |
US6977659B2 (en) * | 2001-10-11 | 2005-12-20 | At & T Corp. | Texture replacement in video sequences and images |
US7606435B1 (en) | 2002-02-21 | 2009-10-20 | At&T Intellectual Property Ii, L.P. | System and method for encoding and decoding using texture replacement |
KR100908384B1 (en) * | 2002-06-25 | 2009-07-20 | 주식회사 케이티 | Region-based Texture Extraction Apparatus Using Block Correlation Coefficient and Its Method |
JP2005122361A (en) * | 2003-10-15 | 2005-05-12 | Sony Computer Entertainment Inc | Image processor, its processing method, computer program, and recording medium |
US7415145B2 (en) * | 2003-12-30 | 2008-08-19 | General Electric Company | Methods and apparatus for artifact reduction |
JP4747881B2 (en) * | 2006-02-27 | 2011-08-17 | セイコーエプソン株式会社 | A data conversion method, a texture creation method, a program, a recording medium, and a projector using an arithmetic processing unit. |
WO2007145941A2 (en) * | 2006-06-06 | 2007-12-21 | Tolerrx, Inc. | Administration of anti-cd3 antibodies in the treatment of autoimmune diseases |
EP1870836A1 (en) * | 2006-06-22 | 2007-12-26 | THOMSON Licensing | Method and device to determine a descriptor for a signal representing a multimedia item, device for retrieving items in a database, device for classification of multimedia items in a database |
JP5358083B2 (en) * | 2007-11-01 | 2013-12-04 | 株式会社日立製作所 | Person image search device and image search device |
US7924290B2 (en) * | 2007-05-30 | 2011-04-12 | Nvidia Corporation | Method and system for processing texture samples with programmable offset positions |
KR101394154B1 (en) * | 2007-10-16 | 2014-05-14 | 삼성전자주식회사 | Method and apparatus for encoding media data and metadata thereof |
JO3076B1 (en) | 2007-10-17 | 2017-03-15 | Janssen Alzheimer Immunotherap | Immunotherapy regimes dependent on apoe status |
US8483431B2 (en) | 2008-05-27 | 2013-07-09 | Samsung Electronics Co., Ltd. | System and method for estimating the centers of moving objects in a video sequence |
US20140321756A9 (en) * | 2008-05-27 | 2014-10-30 | Samsung Electronics Co., Ltd. | System and method for circling detection based on object trajectory |
US8107726B2 (en) * | 2008-06-18 | 2012-01-31 | Samsung Electronics Co., Ltd. | System and method for class-specific object segmentation of image data |
US20100027845A1 (en) * | 2008-07-31 | 2010-02-04 | Samsung Electronics Co., Ltd. | System and method for motion detection based on object trajectory |
US8433101B2 (en) * | 2008-07-31 | 2013-04-30 | Samsung Electronics Co., Ltd. | System and method for waving detection based on object trajectory |
US8073818B2 (en) * | 2008-10-03 | 2011-12-06 | Microsoft Corporation | Co-location visual pattern mining for near-duplicate image retrieval |
KR101194605B1 (en) * | 2008-12-22 | 2012-10-25 | 한국전자통신연구원 | Apparatus and method for synthesizing time-coherent texture |
KR101028628B1 (en) | 2008-12-29 | 2011-04-11 | 포항공과대학교 산학협력단 | Image texture filtering method, storage medium of storing program for executing the same and apparatus performing the same |
US8321422B1 (en) | 2009-04-23 | 2012-11-27 | Google Inc. | Fast covariance matrix generation |
US8611695B1 (en) | 2009-04-27 | 2013-12-17 | Google Inc. | Large scale patch search |
US8396325B1 (en) | 2009-04-27 | 2013-03-12 | Google Inc. | Image enhancement through discrete patch optimization |
US8391634B1 (en) * | 2009-04-28 | 2013-03-05 | Google Inc. | Illumination estimation for images |
US8385662B1 (en) | 2009-04-30 | 2013-02-26 | Google Inc. | Principal component analysis based seed generation for clustering analysis |
US8798393B2 (en) | 2010-12-01 | 2014-08-05 | Google Inc. | Removing illumination variation from images |
LT2648752T (en) | 2010-12-06 | 2017-04-10 | Seattle Genetics, Inc. | Humanized antibodies to liv-1 and use of same to treat cancer |
US8738280B2 (en) * | 2011-06-09 | 2014-05-27 | Autotalks Ltd. | Methods for activity reduction in pedestrian-to-vehicle communication networks |
PL2771031T3 (en) | 2011-10-28 | 2018-09-28 | Prothena Biosciences Limited Co. | Humanized antibodies that recognize alpha-synuclein |
WO2013112945A1 (en) | 2012-01-27 | 2013-08-01 | Neotope Biosciences Limited | Humanized antibodies that recognize alpha-synuclein |
US20130309223A1 (en) | 2012-05-18 | 2013-11-21 | Seattle Genetics, Inc. | CD33 Antibodies And Use Of Same To Treat Cancer |
UA118441C2 (en) | 2012-10-08 | 2019-01-25 | Протена Біосаєнсиз Лімітед | Antibodies recognizing alpha-synuclein |
EP2970453B1 (en) | 2013-03-13 | 2019-12-04 | Prothena Biosciences Limited | Tau immunotherapy |
US10513555B2 (en) | 2013-07-04 | 2019-12-24 | Prothena Biosciences Limited | Antibody formulations and methods |
WO2015004633A1 (en) | 2013-07-12 | 2015-01-15 | Neotope Biosciences Limited | Antibodies that recognize islet-amyloid polypeptide (iapp) |
WO2015004632A1 (en) | 2013-07-12 | 2015-01-15 | Neotope Biosciences Limited | Antibodies that recognize iapp |
KR101713690B1 (en) * | 2013-10-25 | 2017-03-08 | 한국전자통신연구원 | Effective visual descriptor extraction method and system using feature selection |
JP2017501848A (en) | 2013-11-19 | 2017-01-19 | プロセナ バイオサイエンシーズ リミテッド | Monitoring immunotherapy of Lewy body disease from constipation symptoms |
EP3116911B8 (en) | 2014-03-12 | 2019-10-23 | Prothena Biosciences Limited | Anti-mcam antibodies and associated methods of use |
US10059761B2 (en) | 2014-03-12 | 2018-08-28 | Prothena Biosciences Limited | Anti-Laminin4 antibodies specific for LG4-5 |
TW201623331A (en) | 2014-03-12 | 2016-07-01 | 普羅帝納生物科學公司 | Anti-MCAM antibodies and associated methods of use |
CA2938931A1 (en) | 2014-03-12 | 2015-09-17 | Prothena Biosciences Limited | Anti-laminin4 antibodies specific for lg1-3 |
WO2015136468A1 (en) | 2014-03-13 | 2015-09-17 | Prothena Biosciences Limited | Combination treatment for multiple sclerosis |
CA2944402A1 (en) | 2014-04-08 | 2015-10-15 | Prothena Biosciences Limited | Blood-brain barrier shuttles containing antibodies recognizing alpha-synuclein |
US9840553B2 (en) | 2014-06-28 | 2017-12-12 | Kodiak Sciences Inc. | Dual PDGF/VEGF antagonists |
KR102260805B1 (en) * | 2014-08-06 | 2021-06-07 | 삼성전자주식회사 | Image searching device and method thereof |
US20160075772A1 (en) | 2014-09-12 | 2016-03-17 | Regeneron Pharmaceuticals, Inc. | Treatment of Fibrodysplasia Ossificans Progressiva |
KR102258100B1 (en) * | 2014-11-18 | 2021-05-28 | 삼성전자주식회사 | Method and apparatus for processing texture |
TWI718122B (en) | 2015-01-28 | 2021-02-11 | 愛爾蘭商普羅佘納生物科技有限公司 | Anti-transthyretin antibodies |
TWI769570B (en) | 2015-01-28 | 2022-07-01 | 愛爾蘭商普羅佘納生物科技有限公司 | Anti-transthyretin antibodies |
TWI781507B (en) | 2015-01-28 | 2022-10-21 | 愛爾蘭商普羅佘納生物科技有限公司 | Anti-transthyretin antibodies |
WO2016176341A1 (en) | 2015-04-29 | 2016-11-03 | Regeneron Pharmaceuticals, Inc. | Treatment of fibrodysplasia ossificans progressiva |
US10162878B2 (en) | 2015-05-21 | 2018-12-25 | Tibco Software Inc. | System and method for agglomerative clustering |
CN107637064A (en) | 2015-06-08 | 2018-01-26 | 深圳市大疆创新科技有限公司 | Method and apparatus for image processing |
KR101627974B1 (en) * | 2015-06-19 | 2016-06-14 | 인하대학교 산학협력단 | Method and Apparatus for Producing a Blur-Invariant Image Feature Descriptor |
EP4302784A3 (en) | 2015-06-30 | 2024-03-13 | Seagen Inc. | Anti-ntb-a antibodies and related compositions and methods |
CN105183752B (en) * | 2015-07-13 | 2018-08-10 | 中国电子科技集团公司第十研究所 | Method for correlation query of specific content in infrared video images |
WO2017046774A2 (en) | 2015-09-16 | 2017-03-23 | Prothena Biosciences Limited | Use of anti-mcam antibodies for treatment or prophylaxis of giant cell arteritis, polymyalgia rheumatica or takayasu's arteritis |
CA2998716A1 (en) | 2015-09-16 | 2017-03-23 | Prothena Biosciences Limited | Use of anti-mcam antibodies for treatment or prophylaxis of giant cell arteritis, polymyalgia rheumatica or takayasu's arteritis |
IL290457B1 (en) | 2015-12-30 | 2024-10-01 | Kodiak Sciences Inc | Antibodies and conjugates thereof |
WO2017149513A1 (en) | 2016-03-03 | 2017-09-08 | Prothena Biosciences Limited | Anti-mcam antibodies and associated methods of use |
CA3014934A1 (en) | 2016-03-04 | 2017-09-08 | JN Biosciences, LLC | Antibodies to tigit |
WO2017153953A1 (en) | 2016-03-09 | 2017-09-14 | Prothena Biosciences Limited | Use of anti-mcam antibodies for treatment or prophylaxis of granulomatous lung diseases |
WO2017153955A1 (en) | 2016-03-09 | 2017-09-14 | Prothena Biosciences Limited | Use of anti-mcam antibodies for treatment or prophylaxis of granulomatous lung diseases |
CU24537B1 (en) | 2016-05-02 | 2021-07-02 | Prothena Biosciences Ltd | MONOCLONAL ANTIBODIES THAT COMPETE WITH THE 3D6 ANTIBODY FOR BINDING TO HUMAN TAU |
WO2017191559A1 (en) | 2016-05-02 | 2017-11-09 | Prothena Biosciences Limited | Tau immunotherapy |
CU24538B1 (en) | 2016-05-02 | 2021-08-06 | Prothena Biosciences Ltd | MONOCLONAL ANTIBODIES THAT COMPETE WITH THE 16G7 ANTIBODY FOR BINDING TO HUMAN TAU |
WO2017208210A1 (en) | 2016-06-03 | 2017-12-07 | Prothena Biosciences Limited | Anti-mcam antibodies and associated methods of use |
JP7016470B2 (en) | 2016-07-02 | 2022-02-07 | プロセナ バイオサイエンシーズ リミテッド | Anti-transthyretin antibody |
JP7017013B2 (en) | 2016-07-02 | 2022-02-08 | プロセナ バイオサイエンシーズ リミテッド | Anti-transthyretin antibody |
WO2018007922A2 (en) | 2016-07-02 | 2018-01-11 | Prothena Biosciences Limited | Anti-transthyretin antibodies |
WO2018191548A2 (en) | 2017-04-14 | 2018-10-18 | Kodiak Sciences Inc. | Complement factor d antagonist antibodies and conjugates thereof |
IL270375B1 (en) | 2017-05-02 | 2024-08-01 | Prothena Biosciences Ltd | Antibodies recognizing tau |
AU2017434556A1 (en) | 2017-09-28 | 2020-04-09 | F. Hoffmann-La Roche Ag | Dosing regimes for treatment of synucleinopathies |
EP3508499A1 (en) | 2018-01-08 | 2019-07-10 | iOmx Therapeutics AG | Antibodies targeting, and other modulators of, an immunoglobulin gene associated with resistance against anti-tumour immune responses, and uses thereof |
MX2020009152A (en) | 2018-03-02 | 2020-11-09 | Kodiak Sciences Inc | Il-6 antibodies and fusion constructs and conjugates thereof. |
CN112638944A (en) | 2018-08-23 | 2021-04-09 | 西进公司 | anti-TIGIT antibody |
CR20210272A (en) | 2018-11-26 | 2021-07-14 | Forty Seven Inc. | HUMANIZED ANTIBODIES AGAINST C-KIT |
CA3120570A1 (en) | 2018-11-28 | 2020-06-04 | Forty Seven, Inc. | Genetically modified hspcs resistant to ablation regime |
CN109670423A (en) * | 2018-12-05 | 2019-04-23 | 依通(北京)科技有限公司 | Image identification system, method and medium based on deep learning |
JP2022519273A (en) | 2019-02-05 | 2022-03-22 | シージェン インコーポレイテッド | Anti-CD228 antibody and antibody drug conjugate |
CU20210073A7 (en) | 2019-03-03 | 2022-04-07 | Prothena Biosciences Ltd | ANTIBODIES THAT BIND WITHIN THE CDRS-DEFINED MICROTUBULE-BINDING REGION OF TAU |
EP3994171A1 (en) | 2019-07-05 | 2022-05-11 | iOmx Therapeutics AG | Antibodies binding igc2 of igsf11 (vsig3) and uses thereof |
WO2021067776A2 (en) | 2019-10-04 | 2021-04-08 | Seagen Inc. | Anti-pd-l1 antibodies and antibody-drug conjugates |
CA3157509A1 (en) | 2019-10-10 | 2021-04-15 | Kodiak Sciences Inc. | Methods of treating an eye disorder |
EP3822288A1 (en) | 2019-11-18 | 2021-05-19 | Deutsches Krebsforschungszentrum, Stiftung des öffentlichen Rechts | Antibodies targeting, and other modulators of, the cd276 antigen, and uses thereof |
EP4087652A1 (en) | 2020-01-08 | 2022-11-16 | Regeneron Pharmaceuticals, Inc. | Treatment of fibrodysplasia ossificans progressiva |
KR20230005163A (en) | 2020-03-26 | 2023-01-09 | 씨젠 인크. | How to treat multiple myeloma |
US11820824B2 (en) | 2020-06-02 | 2023-11-21 | Arcus Biosciences, Inc. | Antibodies to TIGIT |
EP4175668A1 (en) | 2020-07-06 | 2023-05-10 | iOmx Therapeutics AG | Antibodies binding igv of igsf11 (vsig3) and uses thereof |
KR20230042518A (en) | 2020-08-04 | 2023-03-28 | 씨젠 인크. | Anti-CD228 Antibodies and Antibody-Drug Conjugates |
JP2023547507A (en) | 2020-11-03 | 2023-11-10 | ドイチェス クレブスフォルシュンクスツェントルム スチフトゥング デス エッフェントリヒェン レヒツ | Target cell-restricted and co-stimulatory bispecific and bivalent anti-CD28 antibody |
KR20230147099A (en) | 2021-01-28 | 2023-10-20 | 백신벤트 게엠베하 | METHOD AND MEANS FOR MODULATING B-CELL MEDIATED IMMUNE RESPONSES |
CN117120084A (en) | 2021-01-28 | 2023-11-24 | 维肯芬特有限责任公司 | Methods and means for modulating B cell mediated immune responses |
WO2022162203A1 (en) | 2021-01-28 | 2022-08-04 | Vaccinvent Gmbh | Method and means for modulating b-cell mediated immune responses |
AU2022254727A1 (en) | 2021-04-09 | 2023-10-12 | Seagen Inc. | Methods of treating cancer with anti-tigit antibodies |
TW202327650A (en) | 2021-09-23 | 2023-07-16 | 美商思進公司 | Methods of treating multiple myeloma |
WO2023201268A1 (en) | 2022-04-13 | 2023-10-19 | Gilead Sciences, Inc. | Combination therapy for treating tumor antigen expressing cancers |
AU2023252914A1 (en) | 2022-04-13 | 2024-10-17 | Arcus Biosciences, Inc. | Combination therapy for treating trop-2 expressing cancers |
TW202409083A (en) | 2022-05-02 | 2024-03-01 | 美商阿克思生物科學有限公司 | Anti-tigit antibodies and uses of the same |
WO2024068777A1 (en) | 2022-09-28 | 2024-04-04 | Deutsches Krebsforschungszentrum Stiftung des öffentlichen Rechts | Modified ace2 proteins with improved activity against sars-cov-2 |
WO2024097816A1 (en) | 2022-11-03 | 2024-05-10 | Seagen Inc. | Anti-avb6 antibodies and antibody-drug conjugates and their use in the treatment of cancer |
WO2024108053A1 (en) | 2022-11-17 | 2024-05-23 | Sanofi | Ceacam5 antibody-drug conjugates and methods of use thereof |
WO2024133940A2 (en) | 2022-12-23 | 2024-06-27 | Iomx Therapeutics Ag | Cross-specific antigen binding proteins (abp) targeting leukocyte immunoglobulin-like receptor subfamily b1 (lilrb1) and lilrb2, combinations and uses thereof |
WO2024157085A1 (en) | 2023-01-26 | 2024-08-02 | Othair Prothena Limited | Methods of treating neurological disorders with anti-abeta antibodies |
WO2024191807A1 (en) | 2023-03-10 | 2024-09-19 | Seagen Inc. | Methods of treating cancer with anti-tigit antibodies |
Family Cites Families (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP3010588B2 (en) * | 1991-05-01 | 2000-02-21 | 松下電器産業株式会社 | Pattern positioning device and pattern classification device |
US5579471A (en) | 1992-11-09 | 1996-11-26 | International Business Machines Corporation | Image query system and method |
US5659626A (en) * | 1994-10-20 | 1997-08-19 | Calspan Corporation | Fingerprint identification system |
AU4985096A (en) * | 1995-03-02 | 1996-09-18 | Parametric Technology Corporation | Computer graphics system for creating and enhancing texture maps |
JPH09101970A (en) * | 1995-10-06 | 1997-04-15 | Omron Corp | Method and device for retrieving image |
JP3645024B2 (en) * | 1996-02-06 | 2005-05-11 | 株式会社ソニー・コンピュータエンタテインメント | Drawing apparatus and drawing method |
JPH09251554A (en) * | 1996-03-18 | 1997-09-22 | Nippon Telegr & Teleph Corp <Ntt> | Image processor |
JP3609225B2 (en) * | 1996-11-25 | 2005-01-12 | 日本電信電話株式会社 | Similar object retrieval device |
US6381365B2 (en) * | 1997-08-22 | 2002-04-30 | Minolta Co., Ltd. | Image data processing apparatus and image data processing method |
US6192150B1 (en) * | 1998-11-16 | 2001-02-20 | National University Of Singapore | Invariant texture matching method for image retrieval |
US6424741B1 (en) * | 1999-03-19 | 2002-07-23 | Samsung Electronics Co., Ltd. | Apparatus for analyzing image texture and method therefor |
US6594391B1 (en) * | 1999-09-03 | 2003-07-15 | Lucent Technologies Inc. | Method and apparatus for texture analysis and replicability determination |
KR100788642B1 (en) * | 1999-10-01 | 2007-12-26 | 삼성전자주식회사 | Texture analysing method of digital image |
KR100355404B1 (en) * | 1999-12-03 | 2002-10-11 | 삼성전자 주식회사 | Texture description method and texture retrieval method using Gabor filter in frequency domain |
-
2000
- 2000-02-03 KR KR10-2004-7001082A patent/KR100444778B1/en not_active IP Right Cessation
- 2000-02-03 AU AU24646/00A patent/AU775858B2/en not_active Ceased
- 2000-02-03 SG SG200305770-0A patent/SG142124A1/en unknown
- 2000-02-03 DE DE60036082T patent/DE60036082T2/en not_active Expired - Lifetime
- 2000-02-03 AT AT04000801T patent/ATE490518T1/en not_active IP Right Cessation
- 2000-02-03 WO PCT/KR2000/000091 patent/WO2000046750A1/en active IP Right Grant
- 2000-02-03 KR KR10-2001-7009442A patent/KR100444776B1/en not_active IP Right Cessation
- 2000-02-03 CN CNB008042845A patent/CN100489892C/en not_active Expired - Fee Related
- 2000-02-03 MY MYPI20000387A patent/MY128897A/en unknown
- 2000-02-03 BR BR0007956-1A patent/BR0007956A/en not_active IP Right Cessation
- 2000-02-03 DE DE60045319T patent/DE60045319D1/en not_active Expired - Lifetime
- 2000-02-03 CN CNB200310102673XA patent/CN100405399C/en not_active Expired - Fee Related
- 2000-02-03 KR KR10-2004-7001081A patent/KR100444777B1/en not_active IP Right Cessation
- 2000-02-03 AT AT07101113T patent/ATE385007T1/en not_active IP Right Cessation
- 2000-02-03 CA CA002361492A patent/CA2361492A1/en not_active Abandoned
- 2000-02-03 MX MXPA01007845A patent/MXPA01007845A/en active IP Right Grant
- 2000-02-03 NZ NZ513144A patent/NZ513144A/en unknown
- 2000-02-03 MY MYPI20034720A patent/MY138084A/en unknown
- 2000-02-03 AT AT00903006T patent/ATE371231T1/en not_active IP Right Cessation
- 2000-02-03 DE DE60037919T patent/DE60037919T2/en not_active Expired - Lifetime
- 2000-02-03 KR KR10-2004-7001078A patent/KR100452064B1/en not_active IP Right Cessation
- 2000-02-03 KR KR10-2004-7001080A patent/KR100483832B1/en not_active IP Right Cessation
- 2000-02-03 EP EP00903006A patent/EP1153365B1/en not_active Expired - Lifetime
- 2000-02-03 CA CA002625839A patent/CA2625839A1/en not_active Abandoned
- 2000-02-03 EP EP07101113A patent/EP1777658B1/en not_active Expired - Lifetime
- 2000-02-03 JP JP2000597758A patent/JP2002536750A/en active Pending
- 2000-02-03 EP EP04000801A patent/EP1453000B1/en not_active Expired - Lifetime
- 2000-02-04 US US09/497,504 patent/US6624821B1/en not_active Expired - Lifetime
- 2000-06-23 TW TW089103999A patent/TW528990B/en not_active IP Right Cessation
-
2003
- 2003-05-09 US US10/434,150 patent/US7027065B2/en not_active Expired - Fee Related
-
2004
- 2004-02-05 JP JP2004029900A patent/JP2004158042A/en active Pending
- 2004-03-10 US US10/795,991 patent/US7199803B2/en not_active Expired - Lifetime
Also Published As
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN100489892C (en) | Method and device for restoring image texture description sign for describing image texture characteristics | |
US20120002881A1 (en) | Image management device, image management method, program, recording medium, and integrated circuit | |
CN105989358A (en) | Natural scene video identification method | |
CN105869175A (en) | Image segmentation method and system | |
CA2326631A1 (en) | Representative color designating method using reliability | |
CN101727580A (en) | Image processing apparatus, electronic medium, and image processing method | |
CN115272652A (en) | Dense object image detection method based on multiple regression and adaptive focus loss | |
CN115690500A (en) | Instrument identification method based on an improved U2 network |
CN114205766A (en) | Method for detecting and positioning abnormal node of wireless sensor network | |
CN117522735A (en) | Multi-scale-based dense-flow sensing rain-removing image enhancement method | |
CN117495825A (en) | Method for detecting foreign matters on tower pole of transformer substation | |
CN110689071B (en) | Target detection system and method based on structured high-order features | |
CN108010076A (en) | End-face appearance modeling method for dense industrial bar image detection |
AU2003252765B2 (en) | Image texture retrieving method and apparatus thereof | |
CN118298404A (en) | Traffic sign detection method based on improved YOLOv s model | |
CN118262347A (en) | Pollution instrument reading method based on pointer generation | |
CN116935029A (en) | Remote sensing image rotation target detection method based on deep learning | |
Isenberg et al. | Quantitative Evaluation for Edge Bundling Based on Structural Aesthetics |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
C10 | Entry into substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
C06 | Publication | ||
PB01 | Publication | ||
C14 | Grant of patent or utility model | ||
GR01 | Patent grant | ||
CF01 | Termination of patent right due to non-payment of annual fee | Granted publication date: 2009-05-20; Termination date: 2018-02-03 |