CN104217221A - Method for detecting calligraphy and paintings based on textural features - Google Patents
- Publication number
- CN104217221A CN104217221A CN201410428074.5A CN201410428074A CN104217221A CN 104217221 A CN104217221 A CN 104217221A CN 201410428074 A CN201410428074 A CN 201410428074A CN 104217221 A CN104217221 A CN 104217221A
- Authority
- CN
- China
- Prior art keywords
- painting
- calligraphy
- image
- calligraphy pieces
- point
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Landscapes
- Image Analysis (AREA)
Abstract
The invention provides a method for detecting calligraphy and painting works based on textural features. The method comprises the following steps: obtaining a rice-paper digital image of an authentic calligraphy or painting work; preprocessing the rice-paper digital image to obtain a preprocessed rice-paper digital image; extracting textural features with the SURF algorithm; and matching the textural features of the work to be examined, judging the work authentic if they match and a fake otherwise. When the method is used to verify a contemporary calligraphy or painting work, the features of the authentic work are extracted, cluster centers are obtained with a bidirectional FLANN algorithm combined with feature-point clustering, a square region is cut out around each cluster center, and authenticity is judged from the highest similarity among these square regions. Genuine works and fakes can therefore be distinguished quickly and accurately, and counterfeiting of calligraphy and painting works is prevented.
Description
Technical field
The present invention relates to a method for detecting calligraphy and painting works, and in particular to a method for detecting calligraphy and painting works based on textural features, used to verify the authenticity of contemporary calligraphy and painting works.
Background technology
Calligraphy and painting works have always occupied an important place among Chinese artworks; according to related statistics, they account for roughly seventy to eighty percent of the Chinese art auction market.
In recent years, with economic development and rising living standards, more and more people pursue the art of calligraphy and painting. Works attributed to famous masters keep appearing on the market, yet most buyers know very little about distinguishing genuine works from fakes.
Advances in printing technology have also given counterfeiters an opportunity to profit: with high-end printing equipment they can clone a master's work far more easily, and in far greater quantity, than by hand copying, so the harm is correspondingly greater. Pan Baoqing, executive director of a Sichuan cultural appraisal and evaluation company, once remarked that the Chinese art auction market is like a runaway horse: fakes and counterfeits flood the market, and buying a genuine certified work is extremely difficult.
According to related data, fakes of Chinese calligraphy and painting fill the lower end of the art auction market, and the fake rate of well-known artists' works is also high; in small auction houses it is hard to find a single genuine painting even in the individual special exhibitions they hold. The backwardness of authentication technology for precious collectibles such as calligraphy and painting has therefore become one of the main contradictions hindering and restricting circulation and trading in the cultural and creative market.
Existing anti-counterfeiting technologies mainly comprise infrared anti-counterfeiting, DNA anti-counterfeiting and fingerprint identification.
Infrared anti-counterfeiting is a relatively early technology. Characters, figures, bar codes and the like, visible or invisible to the naked eye, are produced with infrared techniques and printed on the work with special infrared toner, ink or stamp-pad ink; they can then be detected with a dedicated instrument during inspection.
DNA anti-counterfeiting draws on biology: a special DNA segment is transplanted onto the work so that the article acquires a DNA characteristic. Since one article corresponds to one group of genes and every gene pair is unique, the article gains a unique feature, which greatly reduces the difficulty of verification.
In fingerprint identification, after completing a work the artist dips a finger in seal ink and presses a fingerprint beside the seal below the work; the work can then be authenticated by fingerprint recognition.
However, all of the above methods judge authenticity with the help of materials artificially added to the work, and there remains the possibility that counterfeiters forge the work with advanced technology.
Those skilled in the art are therefore devoted to providing an anti-counterfeiting method and system that can distinguish genuine works from fakes quickly and accurately and leaves no loophole for counterfeiters, thereby effectively guaranteeing that works sold at auction are genuine.
Summary of the invention
The object of the present invention is to provide a method for detecting calligraphy and painting works based on textural features, used to verify the authenticity of contemporary works. After the author completes a work, features are extracted from the authentic work. When a work to be examined is tested, the features extracted from the authentic work are used: a bidirectional FLANN algorithm is combined with feature-point clustering to obtain the center of each cluster, a square region is cut out around each center, and the authenticity of the whole work is judged from the highest similarity among these square regions, so that genuine works and fakes can be distinguished quickly and accurately.
The invention provides a method for detecting calligraphy and painting works based on textural features, comprising the following steps:
(1) obtaining a rice-paper digital image of the authentic calligraphy or painting work;
(2) preprocessing the rice-paper digital image to obtain a preprocessed rice-paper digital image;
(3) extracting textural features from the preprocessed rice-paper digital image with the SURF algorithm;
(4) matching the textural features of the work to be examined; if they match, the work to be examined is the authentic work; if they do not match, it is a fake.
Further, in step (4) a bidirectional FLANN matching algorithm is used for the textural feature matching of the work to be examined.
Further, before the textural feature matching in step (4), the method further comprises the following steps:
(41) clustering with an improved K-Means algorithm to obtain the center of every class;
(42) using the class centers obtained in step (41) to extract the same regions of the authentic work and of the work to be examined.
Further, the preprocessing of the rice-paper digital image in step (2) uses one or a combination of color-to-grayscale conversion, histogram equalization, morphological transformation, edge detection and Gabor filtering.
Further, extracting the textural features from the preprocessed rice-paper digital image with the SURF algorithm in step (3) comprises the following steps:
(31) computing Haar features with an integral image;
(32) constructing the Hessian matrix and obtaining extreme points;
(33) applying Gaussian filtering to obtain the criterion for judging whether a point (x, y) is a local extreme point;
(34) extracting feature points;
(35) obtaining the characteristic direction of each feature point;
(36) constructing the SURF feature-point descriptor.
Further, obtaining the characteristic direction in step (35) comprises the following steps:
(351) establishing a circular region with the feature point as the center and six times the scale of the feature point as the radius;
(352) scanning the circular region with a 60-degree sector at a set interval;
(353) computing the Haar wavelet responses with a size of four times the scale of the feature point, and weighting each response according to its distance from the center of the circle: the farther from the center, the smaller the weight;
(354) adding the Haar wavelet response vectors of all points in the sector;
(355) finding the sector whose summed Haar wavelet response vector is the largest; the direction of that vector is the characteristic direction of the feature point.
Further, extracting the feature points in step (34) comprises the following steps:
(341) comparing each local extreme point with the 8 adjacent points at the same scale and the 18 points at the two adjacent scales above and below; if the local extreme point is an extreme value among them, it is kept as a preliminary extreme point;
(342) after all preliminary feature points are obtained, obtaining sub-pixel feature points by linear interpolation, then removing the points below a set threshold to obtain the final feature points.
Compared with the prior art, the method for detecting calligraphy and painting works based on textural features provided by the invention has the following beneficial effects:
(1) after the author completes a work, features are extracted from the authentic work; when a work to be examined is tested, the features extracted from the authentic work are used, a bidirectional FLANN algorithm is combined with feature-point clustering to obtain the center of each cluster, a square region is cut out around each center, and the authenticity of the whole work is judged from the highest similarity among these square regions, so genuine contemporary works and fakes can be distinguished quickly and accurately;
(2) no material other than the work itself needs to be added to assist the judgment; because the rice-paper texture of the authentic work is practically impossible to copy, counterfeiters are prevented from forging the work even with advanced technology.
Brief description of the drawings
Fig. 1 is a flow chart of one embodiment of the present invention;
Fig. 2 is an optical microscope image before histogram equalization;
Fig. 3 is the optical microscope image after histogram equalization;
Fig. 4 is an electron microscope image before histogram equalization;
Fig. 5 is the electron microscope image after histogram equalization;
Fig. 6 is the image before the erosion operation;
Fig. 7 is the structuring element of the erosion operation;
Fig. 8 is the image after the erosion operation;
Fig. 9 is the image obtained by thresholding the optical microscope image of Fig. 3;
Fig. 10 is the image obtained by applying the opening operation to the image of Fig. 9;
Fig. 11 is the image obtained by thresholding the electron microscope image of Fig. 5;
Fig. 12 is the image obtained by applying the opening operation to the image of Fig. 11;
Fig. 13 is the image obtained by applying Laplacian edge detection to the image of Fig. 9;
Fig. 14 is the image obtained by applying Laplacian edge detection to the image of Fig. 11;
Fig. 15 is the image obtained by applying Canny edge detection to the image of Fig. 9;
Fig. 16 is the image obtained by applying Canny edge detection to the image of Fig. 11;
Fig. 17 is the Gabor-filtered image of the image of Fig. 9;
Fig. 18 is the Gabor-filtered image of the image of Fig. 11;
Fig. 19 shows the categories of Haar features;
Fig. 20 illustrates the integral image;
Fig. 21 is the y-direction second-derivative template of the Gaussian filter;
Fig. 22 is the second-order mixed partial-derivative template of the Gaussian filter;
Fig. 23 is a scale pyramid;
Fig. 24 is a scale pyramid;
Fig. 25 shows the positions of the neighbors of a local extreme point in the scale pyramid;
Fig. 26a is a schematic diagram of the characteristic direction;
Fig. 26b is a schematic diagram of the characteristic direction;
Fig. 26c is a schematic diagram of the characteristic direction;
Fig. 27 is a schematic diagram of the descriptor orientation;
Fig. 28 shows the preprocessed matching result of the group without rotation;
Fig. 29 shows the preprocessed matching result of the group with rotation;
Fig. 30 shows the effect of clustering the texture feature points into four classes;
Fig. 31 shows the result of clustering the texture feature points into four classes and extracting the regions.
Detailed description of the embodiments
The technical solution of the present invention is further described below with reference to specific embodiments and the accompanying drawings, but the present invention is not limited to the following embodiments.
As shown in Fig. 1, the method for detecting calligraphy and painting works based on textural features according to one embodiment of the present invention comprises the following steps:
(1) obtaining a rice-paper digital image of the authentic calligraphy or painting work;
(2) preprocessing the rice-paper digital image to obtain a preprocessed rice-paper digital image;
(3) extracting textural features from the preprocessed rice-paper digital image with the SURF algorithm;
(4) matching the textural features of the work to be examined; if they match, the work to be examined is genuine; if they do not match, it is a fake.
The rice-paper digital image of the authentic work in step (1) can be acquired with an optical microscope or an electron microscope.
An optical microscope uses optical principles to magnify fine objects that the human eye cannot distinguish into a visible image, which is then converted into a digital image through a series of processing steps. Because rice paper is made from materials such as wood chips, fibers and mud, its optical image contains many impurities, the contrast is low, the sense of depth is poor, and the textural features are therefore not obvious.
An electron microscope is an electron-optical device in which the signal is produced by the interaction between electrons and the material. Compared with an optical microscope, electromagnetic lenses replace optical lenses, and the electron beam, invisible to the naked eye, is imaged on a fluorescent screen. Its image is therefore a grayscale image with obvious textural features, a strong sense of depth and a large depth of field.
In the present embodiment, an electron microscope is used to acquire the rice-paper digital image of the authentic work.
The purpose of preprocessing the rice-paper digital image in step (2) is to remove unimportant information from the image, enhance the genuinely useful information, strengthen the detectability of the required information, simplify the image data as far as possible, and make feature extraction simpler.
The preprocessing can use spatial-domain or frequency-domain processing.
Spatial-domain processing includes color-to-grayscale conversion, histogram equalization, morphological transformation and edge detection.
Frequency-domain processing includes the Gabor transform.
One or more of the above methods can be used to preprocess the rice-paper digital image.
In the present embodiment, the preprocessing uses histogram equalization and the Gabor transform.
Color-to-grayscale conversion converts the colored rice-paper digital image of the authentic work acquired with the optical microscope into a grayscale image.
The International Commission on Illumination (CIE) selected the three monochromatic colors red (wavelength 700.0 nm), green (wavelength 546.1 nm) and blue (wavelength 435.8 nm) as the three most basic colors; any color can be represented as a combination of these three. This is the RGB color system.
Because the human eye is most sensitive to the green channel, the green channel contains the most texture information; compared with the original image its contrast is also higher and its noise lower. The present invention therefore separates the green channel from the color image acquired by the optical microscope and uses the green-channel value as the gray value of the grayscale image.
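A minimal sketch of this green-channel conversion, assuming an OpenCV BGR input image (the function name and the use of OpenCV here are illustrative, not taken from the filing):
#include <opencv2/opencv.hpp>
#include <vector>
// Use the green channel of a BGR color image as the grayscale image.
cv::Mat greenAsGray(const cv::Mat& bgr)
{
    std::vector<cv::Mat> channels;
    cv::split(bgr, channels);        // channels[0] = B, channels[1] = G, channels[2] = R
    return channels[1].clone();      // green channel values become the gray values
}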
Histogram equalization is a common method in image processing, used to improve the quality of images whose background and foreground are both too bright or both too dark. Its principle is to use the image histogram to adjust the contrast of the whole image: the gray histogram of the original image, concentrated in a few gray intervals, is redistributed uniformly over the whole gray range, so that the number of pixels in each gray interval becomes roughly the same.
The probability that a pixel with gray level i appears in the image is:
p(i) = n_i / n
where n_i is the number of occurrences of gray level i in the grayscale image, L is the highest gray level in the image, and n is the total number of pixels in the image.
Figs. 2 and 3 show an optical microscope image before and after histogram equalization, and Figs. 4 and 5 show an electron microscope image before and after histogram equalization. It can be seen that, for both the optical and the electron microscope image, histogram equalization makes the textural features more obvious.
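A histogram-equalization sketch using OpenCV's equalizeHist (the filing does not name a library for this step, so the call below is an assumption):
#include <opencv2/opencv.hpp>
// Histogram equalization of a single-channel 8-bit rice-paper image.
cv::Mat equalize(const cv::Mat& gray)
{
    cv::Mat equalized;
    cv::equalizeHist(gray, equalized);   // spread the gray histogram over the full range
    return equalized;
}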
Mathematical morphology is a discipline built on set theory and occupies an important position in image processing; it is a powerful tool for analyzing and describing geometric shapes. It mainly studies the geometric features of an image, analyzing its basic features and structure, that is, the relations between elements and between parts of the image. Morphological image processing generally uses neighborhood operations: a structuring element performs a logical operation in the neighborhood of each pixel, and the result is the output image. Erosion and dilation are the most common morphological operations and are mainly used to reduce image noise.
Let F(x) be the structuring element. For every pixel x of M in the space, the erosion can be defined as:
M ⊖ F = {x : F(x) ⊆ M}
Fig. 6 is the image M to be processed (generally a binary image; here the black points), and Fig. 7 is the structuring element F(x); suppose the center of F(x) is the black point in its lower right corner. Erosion compares the center of F(x) with the points of M one by one: if all black points of F(x) fall within M, the point is kept, otherwise it is removed. Fig. 8 is the result after erosion; the eroded result is still within the original M but contains fewer points, as if a layer of M had been eroded away.
Let F(x) be the structuring element. For every point x of M in the space, the dilation can be defined as:
M ⊕ F = {x : F(x) ∩ M ≠ ∅}
The opening operation first erodes the original image and then dilates it. After an appropriate threshold is chosen, the image is binarized with that threshold to obtain a binary image; the opening operation is then applied to obtain the result.
Fig. 9 is the image obtained by thresholding the optical microscope image of Fig. 3; Fig. 10 is the image obtained by applying the opening operation to the image of Fig. 9; Fig. 11 is the image obtained by thresholding the electron microscope image of Fig. 5; Fig. 12 is the image obtained by applying the opening operation to the image of Fig. 11.
Because the image itself contains a lot of noise, the opening operation is strongly affected and the result is unsatisfactory.
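A thresholding-and-opening sketch with OpenCV; the threshold value and the 3×3 structuring element are illustrative choices, not values from the filing:
#include <opencv2/opencv.hpp>
// Binarize the image and apply a morphological opening (erosion followed by dilation).
cv::Mat thresholdAndOpen(const cv::Mat& gray)
{
    cv::Mat binary, opened;
    cv::threshold(gray, binary, 128, 255, cv::THRESH_BINARY);                    // illustrative threshold
    cv::Mat kernel = cv::getStructuringElement(cv::MORPH_RECT, cv::Size(3, 3));  // structuring element F
    cv::morphologyEx(binary, opened, cv::MORPH_OPEN, kernel);
    return opened;
}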
Edge detection is an important image-processing method. Edges are the regions where part of the image changes fastest; they mostly lie between objects, between object and background, and between regions, and are an important basis for further image processing.
The two important attributes of an image edge are direction and amplitude: the edge direction is perpendicular to the direction of the strongest pixel change, and the edge amplitude is the degree of gray-level change. In general, pixels change gently along the edge and sharply perpendicular to it. This change can be detected with differential operators; edges are usually detected with first- or second-order derivatives.
Edge detection includes Laplacian edge detection and Canny edge detection.
Laplacian edge detection uses the Laplacian of the two-dimensional function f(x, y), a second-order differential operator, defined as:
∇²f = ∂²f/∂x² + ∂²f/∂y²
For a 3×3 window this gives the discrete form:
∇²f(x, y) = f(x+1, y) + f(x-1, y) + f(x, y+1) + f(x, y-1) - 4f(x, y)
The above formula is applied as a convolution.
Fig. 13 is the image obtained by applying Laplacian edge detection to the image of Fig. 9, and Fig. 14 the image obtained by applying it to the image of Fig. 11. The result on the electron microscope image is better than on the optical microscope image, but a lot of noise remains and the effect is still not ideal.
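A Laplacian edge-detection sketch with OpenCV's Laplacian, using the 3×3 window discussed above (output handling is illustrative):
#include <opencv2/opencv.hpp>
// Laplacian edge detection with a 3x3 kernel, converted back to 8 bits for display.
cv::Mat laplacianEdges(const cv::Mat& gray)
{
    cv::Mat lap, edges;
    cv::Laplacian(gray, lap, CV_16S, 3);   // second-order derivative over a 3x3 window
    cv::convertScaleAbs(lap, edges);       // absolute value, scaled to 8-bit
    return edges;
}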
Canny edge detection comprises the following steps:
a) Denoising
The image is smoothed with a two-dimensional Gaussian filter G(x, y) to reduce the influence of noise. This is equivalent to filtering successively with the two one-dimensional Gaussian filters G(y) (vertical) and G(x) (horizontal). The smoothed image is:
H(x, y) = G(x, y) * I(x, y) = G(y) * (G(x) * I(x, y))
where I(x, y) denotes the point (x, y) of the original image and H(x, y) the point (x, y) of the filtered image.
b) Finding the brightness gradient in the image
The first-order partial derivatives in the vertical direction, G(y), and in the horizontal direction, G(x), are computed with finite differences, generally with the Sobel or Prewitt operator; the directional derivatives are then used to compute the gradient magnitude.
c) Non-maximum suppression
Non-maximum suppression is applied to the gradient value of each pixel to locate every edge point precisely. For example, in the 3×3 neighborhood of the current pixel, if its gradient magnitude is greater than the gradient magnitudes of the two neighboring pixels along the gradient direction, the point is regarded as a candidate edge point and marked 1; otherwise it is regarded as a non-edge point and marked 0.
d) Thresholding
Double thresholding is applied to the image after non-maximum suppression to remove false edges and connect broken ones. The final high threshold is computed from a manually given high threshold and the gray histogram of the image, and the final low threshold from a given low threshold and the gray histogram. Every candidate edge point of the non-maximum-suppressed image is then compared with the final high threshold and the edge points that pass are recorded; for all recorded edge points, the points in their 8-neighborhoods that exceed the final low threshold are marked as final edge points.
Fig. 15 is the image obtained by applying Canny edge detection to the image of Fig. 9, and Fig. 16 the image obtained by applying it to the image of Fig. 11. The result on the electron microscope image is better than on the optical microscope image: the edges are continuous and clear, the textural features are visible to the naked eye, and the processed image does not differ greatly from the original.
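A Canny sketch with OpenCV; the Gaussian kernel size and the low/high thresholds are illustrative assumptions, since the filing derives its thresholds from the gray histogram:
#include <opencv2/opencv.hpp>
// Canny edge detection: Gaussian denoising, gradient, non-maximum suppression, double threshold.
cv::Mat cannyEdges(const cv::Mat& gray)
{
    cv::Mat blurred, edges;
    cv::GaussianBlur(gray, blurred, cv::Size(5, 5), 1.4);   // step a) denoising
    cv::Canny(blurred, edges, 50, 150);                     // steps b)-d), illustrative low/high thresholds
    return edges;
}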
The Gabor transform used in the frequency-domain processing is a windowed Fourier transform.
The Fourier transform decomposes a signal into components of different frequencies; any signal can be decomposed into a sum of sinusoids. The Gabor function can extract the relevant features at different scales and in different directions, and the frequency and orientation representation of a Gabor filter is very close to that of the human visual system, so it usually performs well in texture processing and recognition.
The two-dimensional Gabor function has two forms of expression.
The first form is:
g(x, y; λ, θ, ψ, σ, γ) = exp(-(x′² + γ²y′²)/(2σ²)) · exp(i(2πx′/λ + ψ))
Real part: exp(-(x′² + γ²y′²)/(2σ²)) · cos(2πx′/λ + ψ)
Imaginary part: exp(-(x′² + γ²y′²)/(2σ²)) · sin(2πx′/λ + ψ)
where:
x′ = x cos θ + y sin θ
y′ = -x sin θ + y cos θ
ψ denotes the phase offset, λ the wavelength of the sinusoid, θ the orientation of the kernel, σ the Gaussian standard deviation, and γ the aspect ratio of the x and y directions.
In the second form, the value of v determines the wavelength of the Gabor filter, the value of u the orientation of the kernel, K is the total number of orientations, and the parameter σ/k determines the size of the Gaussian window.
The second form is used here. The processed image is obtained by convolving the filter function with the image.
Because the Gabor function is strongly directional, the Gabor transform results in the six directions 0°, 30°, 60°, 90°, 120° and 150° are taken, as shown in Figs. 17 and 18: Fig. 17 is the Gabor-filtered image of the optical microscope image of Fig. 9, and Fig. 18 the Gabor-filtered image of the electron microscope image of Fig. 11.
The result of Gabor filtering is thresholded to obtain a binary image. To filter out some image impurities when the results of the six directions are merged, the following screening condition is defined:
Let f_i(x, y) be the gray value at point (x, y) of the Gabor-transformed image whose kernel direction is 30°·i; this value can only be 0 or 255. Let F(x, y) be the gray value of the merged image at point (x, y). If the six directional values at (x, y) fall below the screening threshold, then F(x, y) = 0, i.e., the point is black; otherwise F(x, y) = 255, i.e., the point is white.
It can be seen from Figs. 17 and 18 that Gabor filtering works well both for the electron microscope image and for the optical microscope image, but it produces some pseudo-textures and noise, especially for the image from the optical microscope.
The Gabor filtering step can be coded, for example, as follows.
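A minimal sketch, assuming OpenCV's getGaborKernel and filter2D with illustrative kernel parameters (σ, λ, γ, ψ); the per-point screening rule of the filing would replace the simple accumulation at the end:
#include <opencv2/opencv.hpp>
// Gabor filtering in six orientations (0°, 30°, ..., 150°) with per-direction binarization.
cv::Mat gaborSixDirections(const cv::Mat& gray)
{
    cv::Mat grayF, merged = cv::Mat::zeros(gray.size(), CV_32F);
    gray.convertTo(grayF, CV_32F);
    for (int i = 0; i < 6; ++i) {
        double theta = i * CV_PI / 6.0;                                    // kernel orientation 30°·i
        cv::Mat kernel = cv::getGaborKernel(cv::Size(21, 21), 4.0, theta,
                                            10.0, 0.5, 0.0, CV_32F);       // sigma, lambda, gamma, psi (illustrative)
        cv::Mat response, binary;
        cv::filter2D(grayF, response, CV_32F, kernel);
        cv::threshold(response, binary, 0.0, 255.0, cv::THRESH_BINARY);    // 0 or 255 per direction
        merged += binary;                                                  // accumulate the six directions
    }
    return merged;   // apply the screening condition to this accumulation to obtain F(x, y)
}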
Haar features were first proposed by Papageorgiou et al. ("A general framework for object detection"); Paul Viola and Michael Jones later proposed computing Haar features quickly with integral images ("Rapid object detection using a boosted cascade of simple features"); Rainer Lienhart and Jochen Maydt subsequently extended the Haar feature set with rotated corner features ("An extended set of Haar-like features for rapid object detection").
As shown in Fig. 19, Haar features include edge features, linear features, center features and diagonal features, which are combined into feature templates. A feature template contains white and black rectangles, and the feature value of a template is defined as the pixel sum of the white rectangles minus the pixel sum of the black rectangles.
In the present embodiment, the first two edge features are chosen as the haarx and haary values for texture feature extraction.
Other Haar features or combinations of Haar features can also be chosen for texture feature extraction.
Since a Haar feature is the difference between the pixel sums of two rectangular regions, and the number of features in an image is far larger than the number of pixels, computing the pixel sum of every feature directly would require a very large amount of computation, much of it repeated.
Step (3) extracts textural features from the preprocessed rice-paper digital image with the SURF algorithm and comprises the following steps:
A) Computing Haar features with the integral image
Paul Viola proposed a method of computing Haar features quickly with an integral image: an "integral image" (also called a summed area table) is constructed first, after which any Haar rectangular feature can be obtained by table look-up and a small number of simple operations, which greatly reduces the number of operations.
The integral image value at a pixel is the sum of all gray values in the rectangular region between the origin and the current pixel, the origin being the pixel in the upper left corner of the image.
The integral image of an image I(x, y) (0 ≤ x ≤ M, 0 ≤ y ≤ N) can be written as:
S(x, y) = Σ I(x′, y′) over all x′ ≤ x and y′ ≤ y
In Fig. 20 the black part is the current pixel and the gray part is the integration region.
The gray sum S of any rectangular region in the image can be obtained using only the integral values at its four vertices:
S = S(X1, Y1) + S(X2, Y2) - S(X2, Y1) - S(X1, Y2)
where S(X1, Y1), S(X2, Y2), S(X2, Y1) and S(X1, Y2) are the integral values at the four vertices of the rectangle.
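A sketch of this four-vertex rectangle sum with OpenCV's integral (variable and function names are illustrative):
#include <opencv2/opencv.hpp>
// Gray sum of the rectangle with top-left corner (x1, y1) inclusive and bottom-right (x2, y2) exclusive.
int rectangleSum(const cv::Mat& gray, int x1, int y1, int x2, int y2)
{
    cv::Mat sum;
    cv::integral(gray, sum, CV_32S);   // integral image of size (rows+1) x (cols+1)
    return sum.at<int>(y2, x2) - sum.at<int>(y1, x2)
         - sum.at<int>(y2, x1) + sum.at<int>(y1, x1);
}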
B) Constructing the Hessian matrix and obtaining extreme points
The Hessian matrix is the most important part of the whole SURF algorithm. Suppose f(x, y) is a twice continuously differentiable function of two variables; its Hessian matrix is
H = [ ∂²f/∂x²  ∂²f/∂x∂y ; ∂²f/∂x∂y  ∂²f/∂y² ]
Every pixel thus has its own Hessian matrix, and the discriminant of the matrix is
det H = (∂²f/∂x²)(∂²f/∂y²) - (∂²f/∂x∂y)²
If det H > 0, the function f(x, y) has an extreme point at (x, y).
C) Using Gaussian filtering to obtain the criterion for judging whether a point (x, y) is a local extreme point
Because feature points must be scale independent, the image is Gaussian filtered before the Hessian matrix is constructed, so that representations at different scales are formed. The Gaussian-filtered image function is
L((x, y), t) = G(t) * I(x, y)
where L((x, y), t) is the representation of the image at scale t, obtained by convolving the Gaussian kernel G(t) with the image function I(x, y) at point (x, y), and the Gaussian kernel is
G(t) = g(x, y, t) = (1 / (2πt)) exp(-(x² + y²) / (2t))
with g the Gaussian function and t the Gaussian variance. In this way the Hessian determinant can be computed for every pixel in the image, and the resulting values are used to judge the feature points. Herbert Bay later proposed replacing L(x, t) by approximate values for convenience of application; to reduce the error between the exact and approximate values, the Hessian determinant is expressed as
det H_approx = Dxx · Dyy - (0.9 · Dxy)²
Fig. 21 shows the template for the second derivative in the y direction after Gaussian filtering; to speed up the computation it is approximated by the box template shown on its right, which simplifies matters considerably, and the box template can be computed with the integral image, which further increases the speed. Likewise, Fig. 22 shows the template of the second-order mixed partial derivative in the x and y directions.
This gives the criterion for judging whether a point (x, y) is a local extreme point. The convolutions at different scales can be represented by a pyramid model, each layer of which represents one scale of the image; the representation at different scales is obtained by convolving the Gaussian kernel with the original image. In the SURF algorithm the original image is kept unchanged at every layer and only the size of the filter is changed.
D) Extracting feature points
Figs. 23 and 24 show the scale pyramid. After the local extreme points of the image at different scales are obtained, each point is compared, as shown in Fig. 25, with its 26 neighbors in the three-dimensional scale space, namely the 8 adjacent points at the same scale and the 18 points at the two adjacent scales above and below. If the point is an extreme value among them, it is kept as a preliminary feature point.
After all preliminary feature points are obtained, sub-pixel feature points are obtained by linear interpolation, and points below a set threshold are removed to obtain the final feature points.
E) Obtaining the characteristic direction
In SURF matching, the feature vector is obtained by computing the Haar wavelet responses in the neighborhood of the feature point. If s is the scale of the feature point, a circular region is set up with the feature point as its center and 6s as its radius, and the Haar wavelet responses of the points in it are computed.
In the SURF algorithm, as shown in Figs. 26a, 26b and 26c, the whole circular region is scanned with a 60-degree sector at a set interval. The Haar wavelet responses are computed with a size of 4s and weighted according to their distance from the center of the circle: the farther from the center, the smaller the weight. The Haar wavelet response vectors of all points in the sector are then added, so that each sector yields one vector; finally the sector with the largest vector is found, and the direction of that vector is taken as the characteristic direction of the feature point. Repeating this for every feature point gives the characteristic directions of all feature points.
F) Constructing the SURF feature-point descriptor
A square region of 20s × 20s is taken around the feature point and, aligned with the characteristic direction of the feature point, divided into 4 × 4 = 16 sub-regions, as shown in Fig. 27; the Haar features in each sub-region are used to compute the descriptor of the feature point.
The code for computing the feature vectors is as follows:
SiftFeatureDetector detector;                       // constructor uses internal defaults
std::vector<KeyPoint> keypoints_1, keypoints_2;     // two keypoint vectors to store the feature points
detector.detect(img_1, keypoints_1);                // detect the feature points in img_1 and store them in keypoints_1
detector.detect(img_2, keypoints_2);                // likewise for img_2
// draw the feature points in the images
Mat img_keypoints_1, img_keypoints_2;
drawKeypoints(img_1, keypoints_1, img_keypoints_1, Scalar::all(-1), DrawMatchesFlags::DEFAULT);   // draw the feature points in memory
drawKeypoints(img_2, keypoints_2, img_keypoints_2, Scalar::all(-1), DrawMatchesFlags::DEFAULT);
imshow("sift_keypoints_1", img_keypoints_1);        // display the feature points
imshow("sift_keypoints_2", img_keypoints_2);
// compute the feature vectors
SiftDescriptorExtractor extractor;                  // descriptor extractor object
Mat descriptors_1, descriptors_2;                   // matrices storing the feature vectors
extractor.compute(img_1, keypoints_1, descriptors_1);   // compute the feature vectors
extractor.compute(img_2, keypoints_2, descriptors_2);
Step (4) matches the textural features of the work to be examined.
There are many feature-point matching methods, such as the K nearest neighbors (KNN) method, the brute-force (BF) method and the fast approximate nearest neighbors library (FLANN). Because the FLANN algorithm is suited to high-dimensional data and is more efficient than the other two algorithms, FLANN is chosen here.
However, FLANN matching is directional between the target image and the candidate image: matching the candidate image against the target image and matching the target image against the candidate image do not give the same result. Bidirectional FLANN matching is therefore used here to reduce this error.
Suppose a feature point a1 in image M1 is matched by the FLANN algorithm to the point a2 in image M2 with the smallest distance, giving the initial match pair (a1, a2). From the smallest distance dist_min over all match pairs, a threshold DIST = h·dist_min is set. If the distance of a match pair is less than the threshold DIST, the point a2 in M2 is kept as the candidate match of a1 in M1; otherwise a1 is rejected and the next point of M1 is matched. This yields the match pairs of M1 against M2.
In the same way, the match pairs of M2 against M1 are obtained. If a match pair of M1 against M2 is (a1, a2) and the match pair of the point a2 of M2 against M1 is (a3, a2), the match is accepted only when a3 = a1 and is rejected otherwise. The final match points are thus obtained. Figs. 28 and 29 show the feature-point matching results; the straight lines connect pairs of matched points. Because the first group of rice-paper images is only translated and not rotated, the matching lines are all parallel; the second group contains rotation, so in Fig. 29 the matching lines are not parallel.
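A bidirectional FLANN matching sketch with OpenCV's FlannBasedMatcher; the factor h and the variable names are illustrative assumptions:
#include <opencv2/opencv.hpp>
#include <algorithm>
#include <limits>
#include <vector>
// Bidirectional FLANN matching: distance threshold DIST = h * dist_min plus a cross check.
std::vector<cv::DMatch> bidirectionalFlann(const cv::Mat& descriptors_1,
                                           const cv::Mat& descriptors_2,
                                           double h)
{
    cv::FlannBasedMatcher matcher;
    std::vector<cv::DMatch> m12, m21, good;
    matcher.match(descriptors_1, descriptors_2, m12);   // M1 -> M2
    matcher.match(descriptors_2, descriptors_1, m21);   // M2 -> M1
    double distMin = std::numeric_limits<double>::max();
    for (size_t i = 0; i < m12.size(); ++i)
        distMin = std::min(distMin, (double)m12[i].distance);
    const double DIST = h * distMin;
    for (size_t i = 0; i < m12.size(); ++i) {
        const cv::DMatch& m = m12[i];
        if (m.distance >= DIST) continue;                 // reject matches above the threshold
        if (m21[m.trainIdx].trainIdx == m.queryIdx)       // reverse match must point back to the same point
            good.push_back(m);
    }
    return good;
}
The cross check corresponds to accepting a pair (a1, a2) only when the reverse match of a2 points back to a1, as described above.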
Although bidirectional FLANN matching greatly improves the accuracy of feature-point matching, it cannot guarantee that every match is correct, and the target image itself may contain meaningless feature points, i.e., noise, which do not effectively describe the target of interest.
Because of this, feature-point matching alone cannot be relied on to give an accurate output for calligraphy and painting image matching. Feature-point clustering is therefore used here to extract the same regions.
Among clustering algorithms, K-Means is representative and the most commonly used. It partitions n objects into k clusters so that the total similarity within all clusters is highest, where similarity is measured by the average distance of the points in a cluster to its center. The algorithm starts by choosing K objects at random and treating each as a cluster center; every object is compared with the k centers and assigned to the nearest cluster, the mean and center of each cluster are then updated, and these steps are repeated until the sum of squared errors within each cluster is minimized.
However, the original K-Means algorithm only gives good clustering results when the differences between clusters are large, and it is very sensitive to isolated points and easily disturbed. The present embodiment therefore uses a weighted K-Means method (K-WMeans). The whole computation is as follows:
1. Any K points are chosen as the initial cluster centers. For a cluster S_j with center x_i, d(x_i, x_k) is defined as the distance from x_i to any other point x_k, and the weight of each point is then defined from these distances.
2. For the points in each cluster, the weighted mean is computed and each point is reassigned to the cluster whose weighted value is closest; the weight of cluster S_j is computed from the weights of its members.
3. The cluster centers are updated according to the weights computed above.
4. Steps 2 and 3 are repeated until every cluster is stable and the sum of squared errors within each cluster is minimized.
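A clustering sketch over the keypoint coordinates using OpenCV's plain cv::kmeans (the weighted K-WMeans variant of the filing is not reproduced; K, the termination criteria and the names are illustrative):
#include <opencv2/opencv.hpp>
#include <vector>
// Cluster matched keypoint coordinates into K classes and return the class centers.
std::vector<cv::Point2f> clusterCenters(const std::vector<cv::KeyPoint>& keypoints, int K)
{
    cv::Mat samples((int)keypoints.size(), 2, CV_32F);
    for (int i = 0; i < samples.rows; ++i) {
        samples.at<float>(i, 0) = keypoints[i].pt.x;
        samples.at<float>(i, 1) = keypoints[i].pt.y;
    }
    cv::Mat labels, centers;
    cv::kmeans(samples, K, labels,
               cv::TermCriteria(cv::TermCriteria::EPS + cv::TermCriteria::COUNT, 100, 0.1),
               5, cv::KMEANS_PP_CENTERS, centers);          // 5 restarts, k-means++ initialization
    std::vector<cv::Point2f> result;
    for (int k = 0; k < K; ++k)
        result.push_back(cv::Point2f(centers.at<float>(k, 0), centers.at<float>(k, 1)));
    return result;
}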
In the present embodiment of the method for detecting calligraphy and painting works based on textural features, the feature points are clustered into four classes; the effect after clustering is shown in Fig. 30.
The central point of each cluster is extracted, and a square region of fixed size is cut out around each center. The region of the target image is smaller than that of the candidate image, because feature-point matching errors offset the cluster centers. Fig. 31 shows the effect of clustering into four classes and cutting out one sub-region per class center.
Now the target regions extracted from the candidate image are combined with the cluster centers and an affine transformation is applied to correct the rotation and translation errors; the candidate image is also dilated slightly, again to reduce the error. Let the target region image be of size M×M and the candidate region image of size N×N with N > M. For each offset (i, j) with 0 ≤ i ≤ N-M and 0 ≤ j ≤ N-M, a square region H_{i,j} of the same size as the target region is cut out of the candidate region at (i, j). The similarity S(i, j) between H_{i,j} and the target region is then computed from L_m(n, m) and H_{i,j}(n, m), the gray values of the point (n, m) in the target region and in the cut-out region respectively. The similarity of the region is defined as the maximum of S(i, j), and finally the highest similarity over all regions is taken as the similarity of the two images.
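A sketch of the sliding-window region comparison; OpenCV's matchTemplate with a normalized correlation score stands in for the gray-value similarity S(i, j) defined in the filing:
#include <opencv2/opencv.hpp>
// Slide the M x M target region over the N x N candidate region and return the highest similarity.
double bestRegionSimilarity(const cv::Mat& target, const cv::Mat& candidate)
{
    cv::Mat scores;
    cv::matchTemplate(candidate, target, scores, cv::TM_CCORR_NORMED);   // one score per offset (i, j)
    double minVal, maxVal;
    cv::minMaxLoc(scores, &minVal, &maxVal);                             // keep the highest score
    return maxVal;
}
Taking the maximum over all offsets corresponds to choosing the highest similarity S(i, j) over the candidate region.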
The similarities of the first group of calligraphy and painting works are:
Matching rate 1
Matching rate 0.995885
Matching rate 0.970803
Matching rate 1
The similarities of the second group of calligraphy and painting works are:
Matching rate 0.728111
Matching rate 0.928571
Matching rate 0.871508
Matching rate 0.769504
The first group of data gives the similarity of the region extracted for each class in the first group of images; the maximum matching rate is 1, so the two works can be regarded as identical and the work is therefore authentic. The second group of data gives the similarity of the region extracted for each class in the second group of images; because of errors introduced by rotation and preprocessing, not every region has an especially high matching rate, but one region still reaches a matching rate of 92.8%, so this work can also be regarded as authentic.
The code of the region matching is as follows:
IplImage* pRightImage11 = cvCloneImage(my_rio1);                         // copy of the candidate image
cvSetImageROI(pRightImage11, cvRect(maxx, maxy, rect_wide, rect_wide));  // ROI at the best-matching square region
cvCopy(pRightImage11, matched, 0);                                       // copy the region into matched
IplImage* matchmodel = andoperate(matched, my_rio);                      // andoperate: helper defined elsewhere in the project
cout << "matching rate " << maxvalue << endl;                            // output the highest similarity
The method for detecting calligraphy and painting works based on textural features provided by the invention (1) only needs to extract features from the authentic work once; when a work to be examined is tested, the features extracted from the authentic work are used, a bidirectional FLANN algorithm is combined with feature-point clustering to obtain the center of each cluster, a square region is cut out around each center, and the authenticity of the whole work is judged from the highest similarity among these square regions, so genuine works and fakes can be distinguished quickly and accurately; and (2) no material other than the work itself needs to be added to assist the judgment, and because the rice-paper texture of the authentic work is practically impossible to copy, counterfeiters are prevented from forging the work even with advanced technology.
The preferred embodiments of the present invention have been described in detail above. It should be understood that a person of ordinary skill in the art can make many modifications and variations according to the concept of the present invention without creative effort. Therefore, any technical solution that a person skilled in the art can obtain from the prior art by logical analysis, reasoning or limited experimentation on the basis of the concept of the present invention shall fall within the scope of protection determined by the claims.
Claims (7)
1. A method for detecting calligraphy and painting works based on textural features, characterized in that it comprises the following steps:
(1) obtaining a rice-paper digital image of the authentic calligraphy or painting work;
(2) preprocessing the rice-paper digital image to obtain a preprocessed rice-paper digital image;
(3) extracting textural features from the preprocessed rice-paper digital image with the SURF algorithm;
(4) matching the textural features of the work to be examined; if they match, the work to be examined is said authentic work; if they do not match, it is a fake.
2. The method for detecting calligraphy and painting works based on textural features according to claim 1, characterized in that in step (4) a bidirectional FLANN matching algorithm is used for the textural feature matching of the work to be examined.
3. The method for detecting calligraphy and painting works based on textural features according to claim 2, characterized in that before the textural feature matching in step (4) the method further comprises the following steps:
(41) clustering with an improved K-Means algorithm to obtain the center of every class;
(42) with the class centers obtained in step (41), cutting out square regions centered on the class centers and judging the authenticity of the work to be examined from the highest similarity among the square regions.
4. The method for detecting calligraphy and painting works based on textural features according to claim 1, characterized in that the preprocessing of the rice-paper digital image in step (2) uses one or a combination of color-to-grayscale conversion, histogram equalization, morphological transformation, edge detection and Gabor filtering.
5. The method for detecting calligraphy and painting works based on textural features according to claim 1, characterized in that extracting the textural features from the preprocessed rice-paper digital image with the SURF algorithm in step (3) comprises the following steps:
(31) computing Haar features with an integral image;
(32) constructing the Hessian matrix and obtaining extreme points;
(33) applying Gaussian filtering to obtain the criterion for judging whether a point (x, y) is a local extreme point;
(34) extracting feature points;
(35) obtaining the characteristic direction of each feature point;
(36) constructing the SURF feature-point descriptor.
6. The method for detecting calligraphy and painting works based on textural features according to claim 5, characterized in that obtaining the characteristic direction in step (35) comprises the following steps:
(351) establishing a circular region with the feature point as the center and six times the scale of the feature point as the radius;
(352) scanning the circular region with a 60-degree sector at a set interval;
(353) computing the Haar wavelet responses with a size of four times the scale of the feature point, and weighting each response according to its distance from the center of the circle: the farther from the center, the smaller the weight;
(354) adding the Haar wavelet response vectors of all points in the sector;
(355) finding the sector whose summed Haar wavelet response vector is the largest; the direction of that vector is the characteristic direction of the feature point.
7. The method for detecting calligraphy and painting works based on textural features according to claim 5, characterized in that extracting the feature points in step (34) comprises the following steps:
(341) comparing each local extreme point with the 8 adjacent points at the same scale and the 18 points at the two adjacent scales above and below; if the local extreme point is an extreme value among them, it is kept as a preliminary extreme point;
(342) after all preliminary feature points are obtained, obtaining sub-pixel feature points by linear interpolation, then removing the points below a set threshold to obtain the final feature points.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201410428074.5A CN104217221A (en) | 2014-08-27 | 2014-08-27 | Method for detecting calligraphy and paintings based on textural features |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201410428074.5A CN104217221A (en) | 2014-08-27 | 2014-08-27 | Method for detecting calligraphy and paintings based on textural features |
Publications (1)
Publication Number | Publication Date |
---|---|
CN104217221A true CN104217221A (en) | 2014-12-17 |
Family
ID=52098684
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201410428074.5A Pending CN104217221A (en) | 2014-08-27 | 2014-08-27 | Method for detecting calligraphy and paintings based on textural features |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN104217221A (en) |
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20130080426A1 (en) * | 2011-09-26 | 2013-03-28 | Xue-wen Chen | System and methods of integrating visual features and textual features for image searching |
CN103106265A (en) * | 2013-01-30 | 2013-05-15 | 北京工商大学 | Method and system of classifying similar images |
CN103440668A (en) * | 2013-08-30 | 2013-12-11 | 中国科学院信息工程研究所 | Method and device for tracing online video target |
CN103714349A (en) * | 2014-01-09 | 2014-04-09 | 成都淞幸科技有限责任公司 | Image recognition method based on color and texture features |
Non-Patent Citations (1)
Title |
---|
ZHANG WANQUAN: "Research on an image matching algorithm based on regional SURF (基于区域SURF的图像匹配算法研究)", China Master's Theses Full-text Database *
Cited By (26)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN107004263A (en) * | 2014-12-31 | 2017-08-01 | 朴相来 | Image analysis method, device and computer readable device |
CN107004263B (en) * | 2014-12-31 | 2021-04-09 | 朴相来 | Image analysis method and device and computer readable device |
CN104636733A (en) * | 2015-02-12 | 2015-05-20 | 湖北华中文化产权交易所有限公司 | Image characteristic-based painting and calligraphy work authenticating method |
CN105243384A (en) * | 2015-09-17 | 2016-01-13 | 上海大学 | Pattern recognition-based cultural relic and artwork uniqueness identification method |
CN105389557A (en) * | 2015-11-10 | 2016-03-09 | 佛山科学技术学院 | Electronic official document classification method based on multi-region features |
CN105426884A (en) * | 2015-11-10 | 2016-03-23 | 佛山科学技术学院 | Fast document type recognition method based on full-sized feature extraction |
CN106340011A (en) * | 2016-08-23 | 2017-01-18 | 天津光电高斯通信工程技术股份有限公司 | Automatic detection and identification method for railway wagon door opening |
CN106340011B (en) * | 2016-08-23 | 2019-01-18 | 天津光电高斯通信工程技术股份有限公司 | A kind of automatic detection recognition method that lorry door is opened |
CN107563427A (en) * | 2016-08-25 | 2018-01-09 | 维纳·肖尔岑 | The method and corresponding use that copyright for oil painting is identified |
CN106991419A (en) * | 2017-03-13 | 2017-07-28 | 特维轮网络科技(杭州)有限公司 | Method for anti-counterfeit based on tire inner wall random grain |
CN108734176A (en) * | 2018-05-07 | 2018-11-02 | 南京信息工程大学 | Certificate true-false detection method based on texture |
CN108734176B (en) * | 2018-05-07 | 2021-11-12 | 南京信息工程大学 | Certificate authenticity detection method based on texture |
CN108846681A (en) * | 2018-05-30 | 2018-11-20 | 于东升 | For the method for anti-counterfeit and device of woodwork, anti-fake traceability system |
CN110599665A (en) * | 2018-06-13 | 2019-12-20 | 深圳兆日科技股份有限公司 | Paper pattern recognition method and device, computer equipment and storage medium |
CN109271839A (en) * | 2018-07-23 | 2019-01-25 | 广东数相智能科技有限公司 | A kind of books defect detection method, system and storage medium |
CN109271839B (en) * | 2018-07-23 | 2022-11-01 | 广东数相智能科技有限公司 | Book defect detection method, system and storage medium |
CN109543757A (en) * | 2018-11-27 | 2019-03-29 | 陕西文投艺术品光谱科技有限公司 | A kind of painting and calligraphy painting style identification method based on spectral imaging technology and atlas analysis |
CN110111387B (en) * | 2019-04-19 | 2021-07-27 | 南京大学 | Dial plate characteristic-based pointer meter positioning and reading method |
CN110111387A (en) * | 2019-04-19 | 2019-08-09 | 南京大学 | A kind of pointer gauge positioning and reading algorithm based on dial plate feature |
CN110347855A (en) * | 2019-07-17 | 2019-10-18 | 京东方科技集团股份有限公司 | Paintings recommended method, terminal device, server, computer equipment and medium |
US11341735B2 (en) | 2019-07-17 | 2022-05-24 | Boe Technology Group Co., Ltd. | Image recommendation method, client, server, computer system and medium |
CN111002348A (en) * | 2019-12-25 | 2020-04-14 | 深圳前海达闼云端智能科技有限公司 | Robot performance testing method, robot and computer readable storage medium |
CN111242993A (en) * | 2020-01-08 | 2020-06-05 | 暨南大学 | Method for identifying authenticity of article based on substrate texture image and appearance characteristic image |
CN111242993B (en) * | 2020-01-08 | 2022-04-26 | 暨南大学 | Method for identifying authenticity of article based on substrate texture image and appearance characteristic image |
CN112818730A (en) * | 2020-03-05 | 2021-05-18 | 刘惠敏 | Cloud storage type online signature identification system |
CN111709363A (en) * | 2020-06-16 | 2020-09-25 | 湘潭大学 | Chinese painting authenticity identification method based on rice paper grain feature identification |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN104217221A (en) | Method for detecting calligraphy and paintings based on textural features | |
Gao et al. | Automatic change detection in synthetic aperture radar images based on PCANet | |
CN107610114B (en) | optical satellite remote sensing image cloud and snow fog detection method based on support vector machine | |
Zhang et al. | Contact lens detection based on weighted LBP | |
CN110443128B (en) | Finger vein identification method based on SURF feature point accurate matching | |
Sirmacek et al. | Urban-area and building detection using SIFT keypoints and graph theory | |
CN102426649B (en) | Simple steel seal digital automatic identification method with high accuracy rate | |
Li et al. | A spatial clustering method with edge weighting for image segmentation | |
CN104835175B (en) | Object detection method in a kind of nuclear environment of view-based access control model attention mechanism | |
CN110298376B (en) | Bank bill image classification method based on improved B-CNN | |
CN109919960B (en) | Image continuous edge detection method based on multi-scale Gabor filter | |
CN107844736A (en) | iris locating method and device | |
CN105701495B (en) | Image texture feature extraction method | |
CN107066972B (en) | Natural scene Method for text detection based on multichannel extremal region | |
CN103034838A (en) | Special vehicle instrument type identification and calibration method based on image characteristics | |
CN102306289A (en) | Method for extracting iris features based on pulse couple neural network (PCNN) | |
CN113392856B (en) | Image forgery detection device and method | |
CN109978848A (en) | Method based on hard exudate in multiple light courcess color constancy model inspection eye fundus image | |
Pamplona Segundo et al. | Pore-based ridge reconstruction for fingerprint recognition | |
CN111259756A (en) | Pedestrian re-identification method based on local high-frequency features and mixed metric learning | |
Li et al. | SDBD: A hierarchical region-of-interest detection approach in large-scale remote sensing image | |
CN115311746A (en) | Off-line signature authenticity detection method based on multi-feature fusion | |
Diaz-Escobar et al. | Natural Scene Text Detection and Segmentation Using Phase‐Based Regions and Character Retrieval | |
Anjomshoae et al. | Enhancement of template-based method for overlapping rubber tree leaf identification | |
Ahmed et al. | Retina based biometric authentication using phase congruency |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
C06 | Publication | ||
PB01 | Publication | ||
C10 | Entry into substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
RJ01 | Rejection of invention patent application after publication | ||
RJ01 | Rejection of invention patent application after publication |
Application publication date: 20141217 |