CN108509870A - A kind of Eriocheir sinensis uniqueness recognition methods based on images match - Google Patents


Info

Publication number
CN108509870A
Authority
CN
China
Prior art keywords
image
eriocheir sinensis
characteristic point
point
match
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201810207047.3A
Other languages
Chinese (zh)
Other versions
CN108509870B (en)
Inventor
邰伟鹏
李�浩
汪杰
张炳良
王小林
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Anhui Engineering Information Technology Co Ltd
Original Assignee
Anhui Engineering Information Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Anhui Engineering Information Technology Co Ltd filed Critical Anhui Engineering Information Technology Co Ltd
Priority to CN201810207047.3A priority Critical patent/CN108509870B/en
Publication of CN108509870A publication Critical patent/CN108509870A/en
Application granted granted Critical
Publication of CN108509870B publication Critical patent/CN108509870B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00: Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10: Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00: Pattern recognition
    • G06F18/20: Analysing
    • G06F18/22: Matching criteria, e.g. proximity measures
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00: Image enhancement or restoration
    • G06T5/70: Denoising; Smoothing
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00: Image analysis
    • G06T7/10: Segmentation; Edge detection
    • G06T7/11: Region-based segmentation
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00: Image analysis
    • G06T7/10: Segmentation; Edge detection
    • G06T7/13: Edge detection
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00: Image analysis
    • G06T7/10: Segmentation; Edge detection
    • G06T7/155: Segmentation; Edge detection involving morphological operators
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00: Image analysis
    • G06T7/10: Segmentation; Edge detection
    • G06T7/194: Segmentation; Edge detection involving foreground-background segmentation
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00: Arrangements for image or video recognition or understanding
    • G06V10/40: Extraction of image or video features
    • G06V10/44: Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components
    • G06V10/443: Local feature extraction by matching or filtering
    • G06V10/446: Local feature extraction by matching or filtering using Haar-like filters, e.g. using integral image techniques
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00: Arrangements for image or video recognition or understanding
    • G06V10/70: Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/74: Image or video pattern matching; Proximity measures in feature spaces
    • G06V10/75: Organisation of the matching processes, e.g. simultaneous or sequential comparisons of image or video features; Coarse-fine approaches, e.g. multi-scale approaches; using context analysis; Selection of dictionaries
    • G06V10/757: Matching configurations of points or features
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00: Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/40: Spoof detection, e.g. liveness detection
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00: Indexing scheme for image analysis or image enhancement
    • G06T2207/10: Image acquisition modality
    • G06T2207/10004: Still image; Photographic image
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00: Indexing scheme for image analysis or image enhancement
    • G06T2207/20: Special algorithmic details
    • G06T2207/20024: Filtering details

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Multimedia (AREA)
  • Data Mining & Analysis (AREA)
  • Artificial Intelligence (AREA)
  • Human Computer Interaction (AREA)
  • Evolutionary Computation (AREA)
  • Evolutionary Biology (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • General Engineering & Computer Science (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Computing Systems (AREA)
  • Databases & Information Systems (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Software Systems (AREA)
  • Image Analysis (AREA)

Abstract

The invention discloses a method for uniquely identifying Eriocheir sinensis (Chinese mitten crab) based on image matching, belonging to the field of digital image processing. The method comprises the following steps. Step 1: acquire an original crab image A and segment out the back image M of the crab. Step 2: extract the feature points of back image M. Step 3: retrieve a stored crab back image Q and its feature points from the database, and match the feature points of image Q against those of image M. Step 4: detect mismatched feature points. Step 5: compute the similarity of image Q and image M. Step 6: finish matching and output the crab match information. The invention aims to overcome the inability of existing inspection methods to identify an individual crab reliably. It exploits the recesses, bumps and texture features widely distributed across the crab shell: the acquired image is analysed, compared against a stored image database, and the uniqueness of the crab is judged from the similarity score.

Description

A method for unique identification of Eriocheir sinensis based on image matching
Technical field
The invention belongs to the technical field of image processing and relates to an image matching algorithm, in particular to a method for uniquely identifying Eriocheir sinensis based on image matching.
Background technology
Eriocheir sinensis, also known as the river crab, hairy crab or Chinese mitten crab, is delicious and nutritious, and is one of China's traditional premium aquatic products. In recent years the hairy-crab market, led by the Yangcheng Lake crab, has boomed every autumn and winter, and geographical-indication products and trademarks named after particular waters have multiplied. Driven by profit, however, counterfeits remain rampant in the market despite repeated bans, and passing off a false place of origin is especially severe. To prevent counterfeiting, various regions have adopted a range of anti-counterfeit measures. The method most common on the market is to attach an anti-counterfeit tag to the crab and then verify the number (serial number) or security code on the tag through a published website, telephone hotline or SMS number. Typical attachments include a random code laser-engraved on the crab shell or an anti-counterfeit ring fastened to a crab leg. These common methods have the following defects:
1. Logical separation: the anti-counterfeit tag is logically separate from the crab itself; what is verified is not the crab but the tag attached to it.
2. Ambiguous conclusion: whether the query is made by phone, SMS or website, the result can only state whether the tag number has been verified before; it cannot establish authenticity. A fake code may pass its first verification, while the genuine code is then judged fake at its second verification.
3. Easy to copy: the structure of the tag is easy to counterfeit, even allowing 1:N duplication.
4. No traceability: tag serial numbers are managed loosely and are not linked to production information, so the product cannot be tracked or traced, and a serial number has no defined life cycle.
The main anti-counterfeit inspection techniques for crabs at present are listed below, together with the principles of the relevant technologies and their advantages and disadvantages.
1. Anti-counterfeiting with a security ring
A verification code is printed on the ring; consumers can query it by dialling a hotline, sending an SMS, or entering the code on the official crab verification website.
Advantage: verification is relatively convenient and the ring is cheap.
Disadvantage: a crab wearing a security ring is not necessarily genuine. The ring is easy to copy and its own authenticity is hard to judge, so a fake crab may be certified as genuine, while a genuine crab may fail certification because a fake crab already used its code.
2. Anti-counterfeiting with a QR-code label
Scanning the QR code on the label tells the consumer whether the crab is claimed to be a genuine Yangcheng Lake crab or another well-known brand.
Advantage: with the rapid growth of e-commerce, QR codes and WeChat channels are now ubiquitous; many companies have their own official accounts and QR codes, and the national 315 anti-counterfeiting service centre has integrated QR codes into its verification workflow. QR-code labels provide efficient verification and double as publicity for the company.
Disadvantage: the same as the security ring; a crab carrying a QR-code label is not necessarily a genuine hairy crab.
3. Laser engraving
A laser mark is branded on the crab's back with a laser stamp and may include the manufacturer's logo, net-enclosure number, product serial number, supervising unit and production area.
Advantage: the engraving is shallow, does not harm the crab or affect its survival rate, and the characters are clearly legible. It is harmless, and the mark enjoys a good reputation at home and abroad.
Disadvantage: laser engraving is also easy to counterfeit; if an unscrupulous trader engraves the same verification code or product serial number on an ordinary crab's back, the consumer still cannot tell the two apart.
A search of prior patents finds related schemes. For example, Chinese patent application 201410068984.7, entitled "An anti-counterfeit recognition method for colour images", published 2014.05.07, discloses a method comprising the steps of: acquiring original colour image data; analysing and storing it; acquiring the colour image data to be identified; analysing and comparing. By means of a colour password, the method raises the anti-counterfeit level of colour images: without the original colour password, another factory can hardly reproduce a new colour image consistent with the original, which effectively prevents other producers from copying the original colour image and greatly protects the interests of its manufacturer; consumers can also use the method to verify the authenticity of a colour image. Its drawback is that it is not suited to detecting the biological uniqueness of animals such as crabs and cannot achieve anti-counterfeit recognition of Eriocheir sinensis.
Summary of the invention
1. Technical problem to be solved by the invention
This patent aims to overcome the inability of existing inspection methods to identify an individual Eriocheir sinensis, and provides a uniqueness recognition method based on image matching: the recesses, bumps and texture features widely distributed over the crab shell are extracted by image analysis, the result is compared against a stored image database, and the uniqueness of the crab is judged from the similarity.
2. Technical solution
To achieve the above object, the technical solution provided by the invention is as follows:
The method of the present invention for uniquely identifying Eriocheir sinensis based on image matching comprises the following steps:
Step 1: acquire the original crab image A, then segment out the back image M of the crab;
Step 2: extract the feature points of back image M;
Step 3: retrieve a stored crab back image Q and its feature points from the database, and match the feature points of image Q and image M;
Step 4: detect mismatched feature points;
Step 5: compute the similarity of image Q and image M;
Step 6: finish matching and output the crab match information.
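For illustration, the six steps can be sketched end to end in Python. Every name below is hypothetical, the "descriptors" are toy tuples, and exact-membership counting stands in for the feature matching and mismatch filtering of steps 3-5; this is a minimal sketch of the control flow, not the patented algorithm.

```python
def identify_crab(query_desc, database, threshold=0.5):
    """Steps 3-6: compare the query crab's descriptors against every stored
    crab and report the best match if its similarity clears the threshold."""
    best_id, best_sim = None, 0.0
    for crab_id, stored_desc in database.items():
        # Stand-in for feature matching + mismatch filtering (steps 3-4).
        correct = sum(1 for d in query_desc if d in stored_desc)
        sim = correct / max(len(query_desc), 1)   # step 5: similarity ratio
        if sim > best_sim:
            best_id, best_sim = crab_id, sim
    # Step 6: output the match information (or no match).
    return (best_id, best_sim) if best_sim > threshold else (None, best_sim)

db = {"crab_A": [(1, 2), (3, 4), (5, 6)], "crab_B": [(9, 9), (8, 8), (7, 7)]}
query = [(1, 2), (3, 4), (0, 0)]
best, sim = identify_crab(query, db)   # crab_A shares 2 of 3 descriptors
```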
As a further improvement of the present invention, the back image M of the crab is segmented in step 1 as follows:
Step (1): remove the background to obtain the crab background segmentation image (5);
Step (2): process segmentation image (5) to obtain the back dilation image (7);
Step (3): finally obtain the back image M from the edge information of the crab.
As a further improvement of the present invention, step (1) proceeds as follows:
Step I: photograph the original crab back image A; the shooting tool is a camera-equipped terminal;
Step II: convert the acquired original image A into grayscale image (1);
Step III: denoise grayscale image (1) with a Gaussian filter to obtain filtered image (2);
Step IV: detect the edges of filtered image (2) with the Sobel operator to obtain edge image (3);
Step V: binarize edge image (3) by thresholding to obtain threshold image (4);
Step VI: invert the gray levels of threshold image (4) to obtain mask m1, then apply dilation, hole filling and erosion to threshold image (4) to obtain mask m2;
Step VII: combine masks m1 and m2 with a logical AND to obtain the crab background segmentation image (5).
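Steps IV-V above can be illustrated with a minimal pure-Python sketch of Sobel edge detection followed by thresholding; the 5x5 patch and the threshold value 200 are illustrative assumptions, not values from the patent.

```python
def sobel_magnitude(img):
    """Approximate gradient magnitude |Gx| + |Gy| with the 3x3 Sobel kernels
    (step IV); border pixels are left at 0."""
    h, w = len(img), len(img[0])
    out = [[0] * w for _ in range(h)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            gx = (img[y-1][x+1] + 2*img[y][x+1] + img[y+1][x+1]
                  - img[y-1][x-1] - 2*img[y][x-1] - img[y+1][x-1])
            gy = (img[y+1][x-1] + 2*img[y+1][x] + img[y+1][x+1]
                  - img[y-1][x-1] - 2*img[y-1][x] - img[y-1][x+1])
            out[y][x] = abs(gx) + abs(gy)
    return out

def threshold(img, t):
    """Step V: binarize the edge-response map."""
    return [[255 if v > t else 0 for v in row] for row in img]

# A 5x5 grayscale patch with a vertical step edge down the middle.
patch = [[0, 0, 100, 100, 100] for _ in range(5)]
edges = threshold(sobel_magnitude(patch), 200)
# Interior rows mark the edge columns: [0, 255, 255, 0, 0]
```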
As a further improvement of the present invention, the processing of step (2) is as follows:
Step I: first erode the crab background segmentation image (5), then fill the holes of the eroded image to obtain image (6);
Step II: select the back region by the size of the connected components of image (6) to obtain mask m3;
Step III: dilate mask m3 until it matches the size of the back region in the original crab image A, obtaining the back dilation image (7).
As a further improvement of the present invention, step (3) is specifically: combine the back dilation image (7) with the original image A by a logical AND to obtain the back image M of the crab.
As a further improvement of the present invention, the feature points of back image M in step 2 are extracted with the speeded-up robust features (SURF) algorithm, which comprises two processes: extracting feature points and generating feature description vectors.
As a further improvement of the present invention, extracting the feature points specifically comprises the following steps:
Step I: compute the integral image, defined as follows:
If X = (x, y) denotes a pixel of image I, the integral image I_Σ(X) is the sum of the pixels in the rectangular region whose diagonal corners are the image origin and the point X = (x, y), expressed as
I_Σ(X) = Σ_{i=0..x} Σ_{j=0..y} I(i, j)    (1)
Computing the integral image requires only one pass over the original image, so its cost is very small. If a rectangular region is defined by the four corner values A, B, C and D of the integral image, the sum of the gray values inside it is Σ = A − B − C + D;
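The integral image and the three-addition rectangle sum can be sketched in a few lines of Python; the inclusive-corner bookkeeping below is the standard convention, and the variable names are illustrative.

```python
def integral_image(img):
    """I_sum(x, y) = sum of I(i, j) over all i <= x, j <= y (formula (1)),
    built in a single pass over the image."""
    h, w = len(img), len(img[0])
    ii = [[0] * w for _ in range(h)]
    for y in range(h):
        row = 0
        for x in range(w):
            row += img[y][x]
            ii[y][x] = row + (ii[y-1][x] if y > 0 else 0)
    return ii

def rect_sum(ii, x0, y0, x1, y1):
    """Sum over the inclusive rectangle [x0, x1] x [y0, y1] using only the
    four corner values of the integral image: the A - B - C + D combination."""
    a = ii[y1][x1]
    b = ii[y1][x0 - 1] if x0 > 0 else 0
    c = ii[y0 - 1][x1] if y0 > 0 else 0
    d = ii[y0 - 1][x0 - 1] if x0 > 0 and y0 > 0 else 0
    return a - b - c + d

img = [[1, 2, 3], [4, 5, 6], [7, 8, 9]]
ii = integral_image(img)
# rect_sum(ii, 1, 1, 2, 2) covers pixels 5 + 6 + 8 + 9 = 28
```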
Step II: approximate the image response with the determinant of the Hessian matrix, defined as follows:
If X = (x, y) is a pixel of image I, the Hessian matrix at point X and scale σ is defined as
H(X, σ) = [ L_xx(X, σ)  L_xy(X, σ) ; L_xy(X, σ)  L_yy(X, σ) ]
where L_xx(X, σ) is the convolution of the second-order Gaussian derivative ∂²g(σ)/∂x² with image I at point X;
Step III: the computations of L_xy(X, σ) and L_yy(X, σ) are similar. The feature-point response is obtained by computing the Hessian determinant of every pixel; the Hessian determinant is
det(H) = L_xx·L_yy − (L_xy)²    (5)
Step IV: let the responses obtained by convolving the image with the box-filter masks be D_xx, D_xy and D_yy; the Hessian determinant is then approximated as
det(H) = D_xx·D_yy − (0.9·D_xy)²    (6)
Here det(H) is the box-filter response in the neighbourhood of point I(x, y), and extreme points are detected with det(H);
Step V: judge the extreme points. The judgment uses the determinant and the eigenvalues of the matrix: if the determinant is positive and the eigenvalues have the same sign, the point is an extreme point;
Step VI: compute the det(H) values of the 26 points in the three-dimensional neighbourhood, compare each pixel processed by the Hessian matrix with these 26 values, and keep only the extreme points that are larger than all 26 neighbourhood values, or smaller than all of them, as preliminary feature points. The 26 points of the three-dimensional neighbourhood comprise the 8 neighbours in the pixel's own scale layer plus the 9 neighbours in each of the scale layers above and below;
Step VII: select the strongest feature points. To obtain feature points at sub-pixel accuracy, linear interpolation is used: points below the threshold are removed first, then raising the extremum threshold reduces the number of detected feature points, and the features that remain are the strongest.
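The 26-neighbour comparison of step VI can be sketched as follows; the three 3x3 response layers are toy data standing in for det(H) maps at adjacent scales.

```python
def is_extremum(stack, s, y, x):
    """Compare the response at (scale s, row y, col x) against its 26
    neighbours: 8 in its own scale layer plus 9 in each adjacent layer."""
    centre = stack[s][y][x]
    neighbours = [stack[s + ds][y + dy][x + dx]
                  for ds in (-1, 0, 1)
                  for dy in (-1, 0, 1)
                  for dx in (-1, 0, 1)
                  if not (ds == dy == dx == 0)]
    # Keep the point only if it dominates (or is dominated by) all 26 values.
    return all(centre > v for v in neighbours) or all(centre < v for v in neighbours)

low  = [[1, 1, 1], [1, 1, 1], [1, 1, 1]]
mid  = [[1, 1, 1], [1, 9, 1], [1, 1, 1]]
high = [[1, 1, 1], [1, 1, 1], [1, 1, 1]]
print(is_extremum([low, mid, high], 1, 1, 1))  # True: 9 exceeds all 26 neighbours
```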
As a further improvement of the present invention, generating the feature description vectors specifically comprises the following steps:
Step I: within a circular region centred on each feature point with radius 6σ (σ being the scale of the feature point), compute the Haar wavelet responses in the x and y directions; the responses are Gauss-weighted with a Gaussian of scale 2σ, so that the closer a sample lies to the feature point, the larger its weight;
Step II: slide a fan-shaped window of angle π/3 around the feature point over the circular region and accumulate the Haar wavelet responses inside the window; the direction corresponding to the maximum accumulated response is taken as the principal direction of the feature point;
Step III: centred on the feature point, rotate the coordinate axes to the principal direction, choose a square region of side 20σ along the principal direction, divide it into 4x4 sub-regions, and compute the Haar wavelet responses over a 5σ x 5σ range within each sub-region;
Step IV: let d_x and d_y denote the horizontal and vertical Haar wavelet responses. First weight d_x and d_y to enhance robustness, then sum d_x, d_y, |d_x| and |d_y| over each sub-region, giving a four-dimensional vector υ = (Σd_x, Σd_y, Σ|d_x|, Σ|d_y|); the vectors of all sub-regions concatenated form the feature vector of the point, of length 64;
Step V: normalise the feature vector to make it invariant to rotation, scale and illumination.
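Steps IV-V can be sketched as follows; the Haar responses are toy values, and the 16 identical sub-regions merely illustrate how the 64-dimensional vector is assembled and normalised.

```python
import math

def subregion_vector(dxs, dys):
    """Four-dimensional vector (sum dx, sum dy, sum |dx|, sum |dy|)
    for one of the 4x4 sub-regions (step IV)."""
    return [sum(dxs), sum(dys),
            sum(abs(d) for d in dxs), sum(abs(d) for d in dys)]

def normalize(vec):
    """Unit-length normalisation (step V), giving illumination invariance."""
    n = math.sqrt(sum(v * v for v in vec))
    return [v / n for v in vec] if n else vec

# Toy responses for 16 sub-regions -> a 64-dimensional descriptor.
descriptor = []
for _ in range(16):
    descriptor += subregion_vector([0.5, -0.25], [0.1, 0.2])
descriptor = normalize(descriptor)
```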
As a further improvement of the present invention, step 3 performs image matching with a bidirectional fast approximate nearest-neighbour search, as follows:
Step (1): because the back-image extraction algorithm of the present invention loses part of the edge information, the feature points that the SURF algorithm detects near the image edge suffer some interference. The detected feature points are therefore pre-processed: the segmented crab back image is first binarized, the back edge contour is then extracted, and finally the feature points within a certain distance of the contour are deleted;
Step (2): with the fast approximate nearest-neighbour search, find for feature point m1 of image M its matching point m2 in image Q, recording the match as the pair (m1, m2). With the same method, find for feature point m2 of image Q its matching point m3 in image M, recording the pair (m2, m3);
Step (3): judge the two matching results for feature point m2. If m1 and m3 are the same feature point of image M, the match is judged successful; if m1 and m3 are different feature points of image M, the pair is judged a mismatch.
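The bidirectional check of steps (2)-(3) can be sketched with a brute-force nearest-neighbour search; the patent uses a fast approximate search, so brute force is a stand-in that produces the same matches on small inputs.

```python
def euclid(p, q):
    """Euclidean distance between two descriptor vectors."""
    return sum((a - b) ** 2 for a, b in zip(p, q)) ** 0.5

def nearest(desc, pool):
    """Index of the nearest descriptor in `pool`."""
    return min(range(len(pool)), key=lambda i: euclid(desc, pool[i]))

def cross_check_matches(feats_m, feats_q):
    """Keep pair (i, j) only when feature i of M maps to j of Q AND j maps
    back to the same i: the bidirectional consistency check."""
    matches = []
    for i, f in enumerate(feats_m):
        j = nearest(f, feats_q)
        if nearest(feats_q[j], feats_m) == i:
            matches.append((i, j))
    return matches

m = [[0.0, 0.0], [10.0, 0.0], [5.0, 5.0]]
q = [[9.9, 0.1], [0.1, 0.0]]
print(cross_check_matches(m, q))  # [(0, 1), (1, 0)]; m[2] has no reciprocal match
```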
As a further improvement of the present invention, the mismatched feature points in step 4 are detected from the positions of the successfully matched feature points, in each of the two images, relative to the nearest-neighbour and next-nearest-neighbour feature points, as follows:
Step (1): let (m1, m2, …, mn) and (m1′, m2′, …, mn′) be the feature points of image M and image Q respectively, where m1 and m1′ are a successfully matched pair of feature points, and so on;
Step (2): first sort the pairs matched by the bidirectional fast approximate nearest-neighbour search in ascending order of Euclidean distance, then take the top-ranked pairs as the nearest-neighbour and next-nearest-neighbour matching pairs. The Euclidean distance is defined as
d = sqrt( (x1 − x1′)² + (x2 − x2′)² + … + (x64 − x64′)² )    (7)
where (x1, x2, x3, …, x64) and (x1′, x2′, x3′, …, x64′) are the feature vectors of the two successfully matched feature points;
Step (3): take out the nearest-neighbour and next-nearest-neighbour matching pairs, denoted (m_near, m_near′) and (m_subnear, m_subnear′), then compute in each image the distance from the nearest-neighbour match point to the next-nearest-neighbour match point:
D = |m_near − m_subnear|    (8)
D′ = |m_near′ − m_subnear′|    (9)
Step (4): take out in turn the matching pairs (m_i, m_i′) ranked after the nearest- and next-nearest-neighbour pairs (with 2 < i ≤ 0.7·n), and compute the distance from each pair to the nearest-neighbour match point:
D_i = |m_i − m_near|    (10)
D_i′ = |m_i′ − m_near′|    (11)
Step (5): for each feature point of image Q and image M, compute the ratio of its distance to the nearest-neighbour match point over the distance between the nearest-neighbour and next-nearest-neighbour match points:
R_i = D_i / D,  R_i′ = D_i′ / D′    (12)
Step (6): take the direction from the nearest-neighbour point to the next-nearest-neighbour point in image Q and image M as the positive direction, denoted by the vectors m_near→m_subnear and m_near′→m_subnear′ respectively, then compute the angle between the positive direction and the line from the nearest-neighbour point to each feature point:
cos(Angle_i) = |m_near m_i_subpoint| / |m_near m_i|    (13)
where m_i_subpoint and m_i_subpoint′ are the projections of feature points m_i and m_i′ onto the lines m_near m_subnear and m_near′ m_subnear′ respectively, and the side of the line on which m_i falls determines which half of the circle the angle lies in. The angles Angle_i and Angle_i′ computed here lie in the range [0, 2π];
Step (7): compare the differences between the distance ratios R_i and R_i′ and between the angles Angle_i and Angle_i′; if both differences are below the threshold, the pair (m_i, m_i′) is judged a correct matching pair.
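Steps (4)-(7) can be sketched as follows. For simplicity the angle is computed with `atan2` rather than via the projection points of the patent (the two formulations describe the same geometric angle), and the tolerances in `is_consistent` are illustrative assumptions.

```python
import math

def consistency_features(points, near, subnear):
    """For each matched point, the distance ratio R_i and angle Angle_i,
    measured against the nearest / next-nearest match points."""
    d = math.dist(near, subnear)                       # formula (8)/(9)
    positive = math.atan2(subnear[1] - near[1], subnear[0] - near[0])
    out = []
    for p in points:
        di = math.dist(p, near)                        # formula (10)/(11)
        ang = (math.atan2(p[1] - near[1], p[0] - near[0]) - positive) % (2 * math.pi)
        out.append((di / d, ang))                      # (R_i, Angle_i)
    return out

def is_consistent(rm, am, rq, aq, r_tol=0.2, a_tol=0.3):
    """Step (7): accept the pair when ratios and angles agree within tolerance."""
    return abs(rm - rq) < r_tol and abs(am - aq) < a_tol

# The same relative geometry in both images (one translated and rotated)
# yields identical (R_i, Angle_i) pairs, so the match is accepted.
print(consistency_features([(2, 0)], (0, 0), (1, 0)))  # [(2.0, 0.0)]
```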
As a further improvement of the present invention, step 5 specifically counts the number of correct matching pairs and takes the ratio of correct matches to total matches as the similarity of the two images:
Similarity = (number of correct matching pairs) / (total number of matching pairs)    (14)
If the similarity exceeds a threshold, the two images are judged to show the same crab.
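Steps 5-6 reduce to a ratio and a threshold test; the 0.6 threshold below is an illustrative assumption, since the patent leaves the threshold value unspecified.

```python
def similarity(correct_matches, total_matches):
    """Formula (14): fraction of matches surviving the mismatch filter."""
    return correct_matches / total_matches if total_matches else 0.0

def same_crab(correct, total, threshold=0.6):
    """Step 6: declare identity when the similarity clears the threshold.
    The 0.6 value is an assumption for illustration only."""
    return similarity(correct, total) > threshold

print(same_crab(45, 60))  # 0.75 > 0.6 -> True
```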
3. Advantageous effects
Compared with the prior art, the technical solution provided by the invention has the following notable effects:
The method for uniquely identifying Eriocheir sinensis based on image matching of the present invention applies the SURF algorithm together with the fast approximate nearest-neighbour search, combined with mismatch detection of feature points, to realise a uniqueness recognition algorithm for the crab. When the image is affected by rotation, translation, noise and the like, the method remains robust, significantly improving the matching accuracy of crab images and guaranteeing the reliability of uniqueness recognition. Simulation experiments show that the method performs well, is easy to operate and has definite practical value. The integral image allows fast computation, which greatly improves efficiency; computing the integral image requires only one pass over the original image, so its cost is small.
Description of the drawings
Fig. 1 is the flow chart of the Eriocheir sinensis uniqueness recognition method based on image matching;
Fig. 2 is the flow chart of extracting the crab back image M;
Fig. 3 shows the feature points of a crab back extracted with the SURF algorithm;
Fig. 4 shows the result after deleting the feature points within a certain distance of the back image edge;
Fig. 5 shows the matching result of two back images of the same crab;
Fig. 6 shows the matching result of the back images of two different crabs.
Specific implementation mode
To aid understanding of the present disclosure, the invention is described in detail below with reference to the accompanying drawings and embodiments.
Embodiment 1
As shown in Fig. 1, Fig. 2, Fig. 3, Fig. 4 and Fig. 5, the method of the present invention for uniquely identifying Eriocheir sinensis based on image matching comprises the following steps:
Step 1: acquire the original crab image A, then segment out the back image M of the crab;
Step (1): remove the background to obtain the crab background segmentation image (5);
Step I: photograph the original crab back image A; the shooting tool is a camera-equipped terminal;
Step II: convert the acquired original image A into grayscale image (1);
Step III: denoise grayscale image (1) with a Gaussian filter to obtain filtered image (2);
Step IV: detect the edges of filtered image (2) with the Sobel operator to obtain edge image (3);
Step V: binarize edge image (3) by thresholding to obtain threshold image (4);
Step VI: invert the gray levels of threshold image (4) to obtain mask m1, then apply dilation, hole filling and erosion to threshold image (4) to obtain mask m2;
Step VII: combine masks m1 and m2 with a logical AND to obtain the crab background segmentation image (5).
Step (2): process segmentation image (5) to obtain the back dilation image (7);
Step I: first erode the crab background segmentation image (5), then fill the holes of the eroded image to obtain image (6);
Step II: select the back region by the size of the connected components of image (6) to obtain mask m3;
Step III: dilate mask m3 until it matches the size of the back region in the original crab image A, obtaining the back dilation image (7).
Step (3): combine the back dilation image (7) with the original image A by a logical AND to obtain the back image M of the crab.
Step 2:The characteristic point of Eriocheir sinensis back image M is extracted, the extraction Eriocheir sinensis back image M's Method used by characteristic point is to accelerate robust feature algorithm, and this method includes extracting characteristic point and generating feature description vector two A process.
The extraction characteristic point specifically includes following steps:
Step I:By integral image, the integral image is defined as follows:
If X=(x, y) indicates a certain pixel of image I (X), then integral image IΣ(X) indicate with point X=(x, y) and Image origin is the sum of the pixel being formed by angular vertex in rectangular area, is formulated as follows:
When computing the integral image, only a single pass over the original image is required, so the computation cost is very small. If a rectangular region is defined by the four vertices A, B, C and D (the integral-image values at its corners), the sum of the gray values inside the rectangle is Σ = A - B - C + D;
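The integral image and the four-corner rectangle-sum identity above can be sketched in NumPy; this is an illustrative helper, not code from the patent:

```python
import numpy as np

def integral_image(img):
    # I_sigma(x, y): sum of all pixels from the image origin up to (x, y)
    return img.cumsum(axis=0).cumsum(axis=1)

def box_sum(ii, y0, x0, y1, x1):
    # sum over img[y0:y1+1, x0:x1+1] using four corner lookups
    # (the A - B - C + D identity of the text)
    s = int(ii[y1, x1])
    if y0 > 0:
        s -= int(ii[y0 - 1, x1])
    if x0 > 0:
        s -= int(ii[y1, x0 - 1])
    if y0 > 0 and x0 > 0:
        s += int(ii[y0 - 1, x0 - 1])
    return s

img = np.arange(16).reshape(4, 4)
ii = integral_image(img)
```

After the single cumulative-sum pass, `box_sum` answers any rectangle query in at most four lookups, independent of the rectangle size.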
Step II: Approximate the image using the determinant of the Hessian matrix, which is defined as follows:
Let X = (x, y) be a pixel of the image I(X); the Hessian matrix at point X and scale σ is defined as:
H(X, σ) = [ Lxx(X, σ)  Lxy(X, σ) ; Lxy(X, σ)  Lyy(X, σ) ]
where Lxx(X, σ) is the convolution of the second-order Gaussian derivative ∂²g(σ)/∂x² with the image I at point X;
Step III: Lxy(X, σ) and Lyy(X, σ) are computed similarly. The feature-point response is obtained by computing the determinant of the Hessian matrix at each pixel; the determinant of the Hessian matrix is:
det(H) = LxxLyy - (Lxy)²    (5)
Step IV: Let Dxx, Dxy and Dyy be the Hessian responses obtained after convolving the image with the above box-filter masks; the determinant of the Hessian matrix is then approximated as:
det(H) = DxxDyy - (0.9Dxy)²    (6)
where det(H) is the box-filter response in the neighborhood of the point I(x, y); extreme points are detected using det(H);
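The approximate determinant of equation (6) can be exercised on a synthetic blob. Here plain second-order central differences stand in for SURF's 9×9 box filters (a simplification for illustration); the response should peak at the blob center:

```python
import numpy as np

def hessian_response(img, w=0.9):
    # second-order central differences as stand-ins for the box-filter
    # responses Dxx, Dyy, Dxy of the text
    Dxx = img[1:-1, :-2] - 2 * img[1:-1, 1:-1] + img[1:-1, 2:]
    Dyy = img[:-2, 1:-1] - 2 * img[1:-1, 1:-1] + img[2:, 1:-1]
    Dxy = (img[2:, 2:] - img[2:, :-2] - img[:-2, 2:] + img[:-2, :-2]) / 4.0
    return Dxx * Dyy - (w * Dxy) ** 2   # det(H) ~ DxxDyy - (0.9 Dxy)^2

# a Gaussian blob centered at (10, 10): the response should peak there
y, x = np.mgrid[0:21, 0:21]
blob = np.exp(-((x - 10.0) ** 2 + (y - 10.0) ** 2) / 8.0)
resp = hessian_response(blob)
peak = np.unravel_index(resp.argmax(), resp.shape)   # (9, 9) in cropped coords
```

The cropped response grid is offset by one pixel, so the peak at (9, 9) corresponds to the blob center (10, 10).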
Step V: Judge the extreme points. The judgment is made from the determinant and the eigenvalues of the matrix: if the determinant is positive and the eigenvalues have the same sign, the point is an extreme point;
In order to detect feature points at different scales, a scale space of the image must be built. In the SURF algorithm the image size remains unchanged and the size of the box filter is varied instead; using the integral image, the responses are computed quickly, which greatly improves efficiency;
Step VI: Compute the det(H) values of the 26 points in the three-dimensional neighborhood, then compare the det(H) value of each pixel processed by the Hessian matrix with those of its 26 neighbors, and finally retain the extreme points that are larger than all 26 neighborhood values (or smaller than all of them) as preliminary feature points. The 26 points of the three-dimensional neighborhood comprise the 8 neighboring points in the point's own scale layer and the 9 neighboring points in each of the scale layers above and below;
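Step VI's 26-neighbor comparison can be sketched as a brute-force scan over three adjacent scale layers; this is a hypothetical helper for illustration only:

```python
import numpy as np

def local_extrema(stack, threshold=0.0):
    # stack: det(H) responses at 3 adjacent scale layers, shape (3, H, W).
    # A pixel in the middle layer is kept when it strictly exceeds all
    # 26 neighbours: 8 in its own layer, 9 above and 9 below.
    s, h, w = stack.shape
    assert s == 3
    maxima = []
    for yy in range(1, h - 1):
        for xx in range(1, w - 1):
            v = stack[1, yy, xx]
            cube = stack[:, yy - 1:yy + 2, xx - 1:xx + 2]  # 3x3x3 neighbourhood
            if v > threshold and v >= cube.max() and (cube == v).sum() == 1:
                maxima.append((yy, xx))
    return maxima

stack = np.zeros((3, 5, 5))
stack[1, 2, 2] = 5.0      # a single planted maximum in the middle layer
stack[0, 1, 1] = 1.0      # a weaker response in the layer below
```

Only the planted point at (2, 2) survives; the minimum case (smaller than all 26 neighbors) would be handled symmetrically.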
Step VII: Select the strongest feature points. Specifically, in order to obtain feature points at sub-pixel accuracy, linear interpolation is used: points whose response is below the threshold are removed first, then the extremum threshold is raised so that the number of detected feature points decreases; the feature points finally retained are the strongest. In this embodiment the threshold is set to 1500.
As a further improvement of the present invention, generating the feature description vector specifically includes the following steps:
Step I: To ensure that the feature-point descriptor is rotation-invariant, a dominant orientation must be assigned to each feature point. Within a circular region centered at each feature point with radius 6σ (σ being the scale corresponding to the feature point), compute the Haar wavelet responses of the feature point in the x and y directions; the Haar wavelet responses are Gaussian-weighted using a Gaussian weighting function of scale 2σ, so that the closer a sample is to the feature point, the larger its weight;
Step II: Centered at the feature point, slide a sector window of angle π/3 around the circular region and accumulate the Haar wavelet responses of the image inside the sliding window; the direction corresponding to the maximum accumulated Haar response is taken as the dominant orientation of the feature point;
Step III: Centered at the feature point, first rotate the coordinate axes to the dominant orientation, choose along the dominant orientation a square region with side length 20σ, divide this region into 4×4 sub-regions, and compute the Haar wavelet responses within a 5σ×5σ range in each sub-region;
Step IV: Let dx and dy denote the Haar wavelet responses in the horizontal and vertical directions respectively. First, to enhance robustness, dx and dy are weighted; then the sums of dx, dy, |dx| and |dy| over each sub-region yield a four-dimensional vector υ = (Σdx, Σdy, Σ|dx|, Σ|dy|); the vectors of all the sub-regions together constitute the feature vector of the point, whose length is 64;
Step V: Normalize the feature vector so that it is invariant to rotation, scale and illumination.
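Steps III–V can be sketched as follows, assuming the Haar responses dx and dy over the rotated 20σ window have already been sampled onto a 20×20 grid (the Gaussian weighting of Step IV is omitted in this sketch):

```python
import numpy as np

def surf_descriptor(dx, dy):
    # dx, dy: 20x20 grids of Haar wavelet responses around the feature
    # point, already rotated to the dominant orientation.
    feats = []
    for i in range(4):            # 4x4 sub-regions of 5x5 samples each
        for j in range(4):
            sx = dx[5 * i:5 * i + 5, 5 * j:5 * j + 5]
            sy = dy[5 * i:5 * i + 5, 5 * j:5 * j + 5]
            # the four-dimensional vector (sum dx, sum dy, sum|dx|, sum|dy|)
            feats += [sx.sum(), sy.sum(), np.abs(sx).sum(), np.abs(sy).sum()]
    v = np.array(feats)           # 16 sub-regions x 4 = 64 dimensions
    n = np.linalg.norm(v)
    return v / n if n > 0 else v  # Step V: normalisation

rng = np.random.default_rng(0)
desc = surf_descriptor(rng.standard_normal((20, 20)),
                       rng.standard_normal((20, 20)))
```

The result is the 64-dimensional, unit-norm feature vector described in the text.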
Step 3: Extract the saved Eriocheir sinensis back image Q and its feature points from the database, and perform image matching using a bidirectional fast approximate nearest-neighbor search algorithm to match the feature points of image Q and image M; the specific steps are as follows:
Step (1): Because the back-image extraction algorithm of the present invention loses part of the edge information, the feature points detected by the SURF algorithm near the image edges cause a certain amount of interference. The detected feature points are therefore pre-processed: first binarize the segmented Eriocheir sinensis back image, then extract the back edge contour, and finally delete the feature points lying within a certain distance of the back edge contour;
Step (2): Using the fast approximate nearest-neighbor search algorithm, find for feature point m1 in image M its match point m2 in image Q, and record the match pair (m1, m2). Then, by the same method, find for feature point m2 in image Q its match point m3 in image M, and record the match pair (m2, m3);
Step (3): Judge the two corresponding points m1 and m3 obtained from the two matchings of feature point m2. If m1 and m3 are the same feature point in image M, the match is judged successful; if m1 and m3 are different feature points in image M, the pair is judged a mismatch.
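The bidirectional check of Steps (2)–(3) amounts to a cross-check: a pair (m1, m2) is kept only if matching back from m2 returns m1. A brute-force NumPy stand-in for the fast approximate nearest-neighbor search:

```python
import numpy as np

def cross_check_matches(desc_m, desc_q):
    # brute-force substitute for the fast approximate NN search:
    # match M -> Q, then Q -> M, and keep a pair only when the backward
    # match returns the original feature point.
    d = np.linalg.norm(desc_m[:, None, :] - desc_q[None, :, :], axis=2)
    fwd = d.argmin(axis=1)        # m1 -> m2
    bwd = d.argmin(axis=0)        # m2 -> m3
    return [(i, int(fwd[i])) for i in range(len(desc_m)) if bwd[fwd[i]] == i]

# toy descriptors: the third point in M has no consistent partner in Q
m = np.array([[0.0, 0.0], [1.0, 1.0], [5.0, 5.0]])
q = np.array([[1.1, 1.0], [0.1, 0.0]])
matches = cross_check_matches(m, q)
```

Only mutually nearest pairs survive; the unmatched point in M is discarded as a mismatch.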
Step 4: Detect mismatched feature points. The mismatch-detection method examines, for each pair of successfully matched feature points, the positional relationship of the feature point in each of the two images relative to the nearest-neighbor feature point and the second-nearest-neighbor feature point, specifically as follows:
Step (1): Let (m1, m2, …, mn) and (m1′, m2′, …, mn′) be the feature points in image M and image Q respectively, where m1 and m1′ are a pair of successfully matched feature points, and so on;
Step (2): First sort the match pairs produced by the bidirectional fast approximate nearest-neighbor search in ascending order of Euclidean distance, then select from the top-ranked pairs the nearest-neighbor and second-nearest-neighbor match pairs; the Euclidean distance is defined as:
d = sqrt( Σ(k=1..64) (xk - xk′)² )    (7)
where (x1, x2, x3, …, x64) and (x1′, x2′, x3′, …, x64′) are the feature vectors of the two successfully matched feature points;
Step (3): First take the nearest-neighbor and second-nearest-neighbor match pairs, denoted (mnear, mnear′) and (msubnear, msubnear′); then compute, in each of the two images, the distance from the nearest-neighbor point to the second-nearest-neighbor point:
D = |mnear - msubnear|    (8)
D′ = |mnear′ - msubnear′|    (9)
Step (4): Take, in ranking order, each subsequent match pair (mi, mi′) after the nearest-neighbor and second-nearest-neighbor matches (where 2 < i ≤ 0.7·n), and compute the distance from this pair to the nearest-neighbor match point:
Di = |mi - mnear|    (10)
Di′ = |mi′ - mnear′|    (11)
Step (5): For each feature point in image Q and image M, compute the ratio of its distance to the nearest-neighbor match point to the distance between the nearest-neighbor match point and the second-nearest-neighbor match point:
Ri = Di / D, Ri′ = Di′ / D′
Step (6): Take the direction from the nearest-neighbor point toward the second-nearest-neighbor point in image Q and image M as the positive direction, denoted by the vectors mnear msubnear and mnear′ msubnear′ respectively; then compute the angle formed between the line from the nearest-neighbor point to each feature point and the positive direction,
where mi_subpoint and mi_subpoint′ are the projections of feature points mi and mi′ onto the lines mnear msubnear and mnear′ msubnear′ respectively. The angles Anglei and Anglei′ computed here range over [0, 2π];
Step (7): Compare the difference between the distance ratios Ri and Ri′ and the difference between the angles Anglei and Anglei′; if both differences are less than their thresholds, the match pair (mi, mi′) is judged a correct match. In this embodiment the distance-ratio threshold is set to 0.5 and the angle threshold to 0.2.
Step 5: Compute the similarity between image Q and image M, specifically by counting the number of correct match pairs and taking the ratio of the number of correct match pairs to the total number of match pairs as the similarity of the two images:
Similarity = (number of correct match pairs) / (total number of match pairs)
If the similarity exceeds the threshold, the two images are judged to be the same crab;
In this embodiment the threshold is set to 0.8;
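Step 5 reduces to a simple ratio; with the figures reported for Embodiment 2 (52 correct pairs out of 57 total) the similarity is about 91.23%, above the 0.8 threshold:

```python
def similarity(correct, total):
    # ratio of correct match pairs to the total number of match pairs
    return correct / total if total else 0.0

# the numbers from Embodiment 2: 52 correct pairs out of 57 total
same_crab = similarity(52, 57) > 0.8
```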
Step 6: Matching ends, and the crab matching information is output.
This embodiment applies the SURF algorithm and the fast approximate nearest-neighbor search algorithm, combined with feature-point mismatch detection, to realize a unique-identification algorithm for Eriocheir sinensis. When the image is affected by rotation, translation, noise and the like, the present invention shows good robustness, significantly improves the accuracy of Eriocheir sinensis image matching, and guarantees reliable unique identification of Eriocheir sinensis; in addition, simulation results show that the method performs well, is easy to operate, and has definite practical value.
Embodiment 2
As shown in Fig. 1, Fig. 2, Fig. 3, Fig. 4 and Fig. 5, this embodiment is substantially the same as Embodiment 1. Preferably, in this embodiment the total number of match pairs is 57 and the number of correct match pairs is 52, giving a similarity of 91.2281%; it can therefore be judged that the two Eriocheir sinensis in Fig. 5 are the same individual.
Embodiment 3
As shown in Fig. 1, Fig. 2, Fig. 3, Fig. 4 and Fig. 6, this embodiment is substantially the same as Embodiment 1. Preferably, in this embodiment the total number of match pairs is 20 and the number of correct match pairs is 0, giving a similarity of 0; it can therefore be judged that the two Eriocheir sinensis in Fig. 6 are not the same individual.
The present invention and its embodiments have been described above schematically, and the description is not limiting; what is shown in the drawings is only one of the embodiments of the present invention, and the actual structure is not limited thereto. Therefore, if persons of ordinary skill in the art, enlightened by the above, design without inventive effort structural modes and embodiments similar to this technical solution without departing from the spirit of the invention, such designs shall fall within the protection scope of the present invention.

Claims (11)

1. A unique-identification method for Eriocheir sinensis based on image matching, characterized by comprising the following steps:
Step 1: Acquire an original Eriocheir sinensis image A, then segment out the back image M of the Eriocheir sinensis;
Step 2: Extract the feature points of the Eriocheir sinensis back image M;
Step 3: Extract the saved Eriocheir sinensis back image Q and its feature points from a database, and match the feature points of image Q and image M;
Step 4: Detect mismatched feature points;
Step 5: Compute the similarity between image Q and image M;
Step 6: End matching and output the crab matching information.
2. The unique-identification method for Eriocheir sinensis based on image matching according to claim 1, characterized in that segmenting the back image M of the Eriocheir sinensis in Step 1 comprises the following steps:
Step (1): Remove the background to obtain the Eriocheir sinensis background segmentation image ⑤;
Step (2): Process the Eriocheir sinensis background segmentation image ⑤ to obtain the dilated back-region image ⑦;
Step (3): Finally obtain the back image M of the Eriocheir sinensis according to the edge information of the Eriocheir sinensis.
3. The unique-identification method for Eriocheir sinensis based on image matching according to claim 2, characterized in that Step (1) comprises the following steps:
Step I: Photograph the original Eriocheir sinensis back image A;
Step II: Convert the acquired original Eriocheir sinensis image A into gray-scale image ①;
Step III: Denoise gray-scale image ① using Gaussian filtering, obtaining Gaussian-filtered image ②;
Step IV: Detect the edges of Gaussian-filtered image ② using the Sobel operator, obtaining Sobel edge map ③;
Step V: Binarize Sobel edge map ③ by threshold segmentation, obtaining threshold-segmented image ④;
Step VI: Apply gray-level inversion to threshold-segmented image ④ to obtain mask m1; then apply dilation, hole filling and erosion operations to threshold-segmented image ④ to obtain mask m2;
Step VII: Perform an AND operation on the two masks m1 and m2 to obtain the Eriocheir sinensis background segmentation image ⑤.
4. The unique-identification method for Eriocheir sinensis based on image matching according to claim 2 or 3, characterized in that the specific processing steps of Step (2) are as follows:
Step I: First erode the Eriocheir sinensis background segmentation image ⑤, then fill the holes in the eroded segmentation image, obtaining image ⑥;
Step II: Screen out the back region according to the sizes of the connected components of image ⑥, obtaining mask m3;
Step III: Dilate mask m3 so that it matches the size of the back region in the original Eriocheir sinensis image A, obtaining the dilated back-region image ⑦.
5. The unique-identification method for Eriocheir sinensis based on image matching according to claim 2, characterized in that Step (3) is specifically: perform an AND operation on the dilated back-region image ⑦ and the original image A to obtain the back image M of the Eriocheir sinensis.
6. The unique-identification method for Eriocheir sinensis based on image matching according to claim 1, characterized in that: the method used in Step 2 to extract the feature points of the Eriocheir sinensis back image M is the Speeded-Up Robust Features (SURF) algorithm, which comprises two processes: extracting the feature points and generating the feature description vectors.
7. The unique-identification method for Eriocheir sinensis based on image matching according to claim 6, characterized in that extracting the feature points specifically includes the following steps:
Step I: Compute the integral image, which is defined as follows:
Let X = (x, y) denote a pixel of the image I(X); the integral image IΣ(X) is the sum of the pixels in the rectangular region whose diagonal vertices are the point X = (x, y) and the image origin, expressed by the formula:
IΣ(X) = Σ(i=0..x) Σ(j=0..y) I(i, j)
If a rectangular region is defined by the four vertices A, B, C and D, the sum of the gray values of the rectangular region is Σ = A - B - C + D;
Step II: Approximate the image using the determinant of the Hessian matrix, which is defined as follows:
Let X = (x, y) be a pixel of the image I(X); the Hessian matrix at point X and scale σ is defined as:
H(X, σ) = [ Lxx(X, σ)  Lxy(X, σ) ; Lxy(X, σ)  Lyy(X, σ) ]
where Lxx(X, σ) is the convolution of the second-order Gaussian derivative ∂²g(σ)/∂x² with the image I at point X;
Step III: Compute the determinant of the Hessian matrix of each pixel; the determinant of the Hessian matrix of each pixel is:
det(H) = LxxLyy - (Lxy)²    (5)
Step IV: Let the obtained Hessian matrix parameters be Dxx, Dxy and Dyy; the determinant of the Hessian matrix is then approximated as:
det(H) = DxxDyy - (0.9Dxy)²    (6)
where det(H) is the box-filter response in the neighborhood of the point I(x, y);
Step V: Judge the extreme points: if the determinant is positive and the eigenvalues have the same sign, the point is an extreme point;
Step VI: Compute the det(H) values of the 26 points in the three-dimensional neighborhood, then compare the det(H) value of each pixel processed by the Hessian matrix with those of its 26 neighbors, and finally retain the extreme points that are larger than all 26 neighborhood values (or smaller than all of them) as preliminary feature points; the 26 points of the three-dimensional neighborhood comprise the 8 neighboring points in the point's own scale layer and the 9 neighboring points in each of the scale layers above and below;
Step VII: Select the strongest feature points; the specific method uses linear interpolation: points below the threshold are removed first, then the extremum threshold is raised; the feature points finally retained are the strongest.
8. The unique-identification method for Eriocheir sinensis based on image matching according to claim 6, characterized in that generating the feature description vector specifically includes the following steps:
Step I: Within a circular region centered at each feature point with radius 6σ (σ being the scale corresponding to the feature point), compute the Haar wavelet responses of the feature point in the x and y directions; the Haar wavelet responses are Gaussian-weighted using a Gaussian weighting function of scale 2σ, so that the closer a sample is to the feature point, the larger its weight;
Step II: Centered at the feature point, slide a sector window of angle π/3 around the circular region and accumulate the Haar wavelet responses of the image inside the sliding window; the direction corresponding to the maximum accumulated Haar response is the dominant orientation of the feature point;
Step III: Centered at the feature point, first rotate the coordinate axes to the dominant orientation, choose along the dominant orientation a square region with side length 20σ, divide this region into 4×4 sub-regions, and compute the Haar wavelet responses within a 5σ×5σ range in each sub-region;
Step IV: Let dx and dy denote the Haar wavelet responses in the horizontal and vertical directions respectively; first weight dx and dy, then sum dx, dy, |dx| and |dy| over each sub-region to obtain a four-dimensional vector υ = (Σdx, Σdy, Σ|dx|, Σ|dy|); the vectors of the sub-regions constitute the feature vector of the point;
Step V: Normalize the feature vector.
9. The unique-identification method for Eriocheir sinensis based on image matching according to claim 1, characterized in that Step 3 performs image matching using a bidirectional fast approximate nearest-neighbor search algorithm, with the following specific steps:
Step (1): Pre-process the detected feature points, specifically: first binarize the segmented Eriocheir sinensis back image, then extract the back edge contour, and finally delete the feature points lying within a certain distance of the back edge contour;
Step (2): Using the fast approximate nearest-neighbor search algorithm, find for feature point m1 in image M its match point m2 in image Q, and record the match pair (m1, m2); then, by the same method, find for feature point m2 in image Q its match point m3 in image M, and record the match pair (m2, m3);
Step (3): Judge the two corresponding points m1 and m3 obtained from the two matchings of feature point m2; if m1 and m3 are the same feature point in image M, the match is judged successful.
10. The unique-identification method for Eriocheir sinensis based on image matching according to claim 1, characterized in that the method for detecting mismatched feature points in Step 4 examines, for each pair of successfully matched feature points, the positional relationship of the feature point in each of the two images relative to the nearest-neighbor feature point and the second-nearest-neighbor feature point, specifically as follows:
Step (1): Let (m1, m2, …, mn) and (m1′, m2′, …, mn′) be the feature points in image M and image Q respectively, where m1 and m1′ are a pair of successfully matched feature points, and so on;
Step (2): First sort the match pairs produced by the bidirectional fast approximate nearest-neighbor search in ascending order of Euclidean distance, then select from the top-ranked pairs the nearest-neighbor and second-nearest-neighbor match pairs; the Euclidean distance is defined as:
d = sqrt( Σ(k=1..64) (xk - xk′)² )
where (x1, x2, x3, …, x64) and (x1′, x2′, x3′, …, x64′) are the feature vectors of the two successfully matched feature points;
Step (3): First take the nearest-neighbor and second-nearest-neighbor match pairs, denoted (mnear, mnear′) and (msubnear, msubnear′); then compute, in each of the two images, the distance from the nearest-neighbor point to the second-nearest-neighbor point:
D = |mnear - msubnear|    (8)
D′ = |mnear′ - msubnear′|    (9)
Step (4): Take, in ranking order, each subsequent match pair (mi, mi′) after the nearest-neighbor and second-nearest-neighbor matches (where 2 < i ≤ 0.7·n), and compute the distance from this pair to the nearest-neighbor match point:
Di = |mi - mnear|    (10)
Di′ = |mi′ - mnear′|    (11)
Step (5): For each feature point in image Q and image M, compute the ratio of its distance to the nearest-neighbor match point to the distance between the nearest-neighbor match point and the second-nearest-neighbor match point;
Step (6): Take the direction from the nearest-neighbor point toward the second-nearest-neighbor point in image Q and image M as the positive direction, denoted by the vectors mnear msubnear and mnear′ msubnear′ respectively; then compute the angle formed between the line from the nearest-neighbor point to each feature point and the positive direction,
where mi_subpoint and mi_subpoint′ are the projections of feature points mi and mi′ onto the lines mnear msubnear and mnear′ msubnear′ respectively; the angles Anglei and Anglei′ computed here range over [0, 2π];
Step (7): Compare the difference between the distance ratios Ri and Ri′ and the difference between the angles Anglei and Anglei′; if both differences are less than their thresholds, the match pair (mi, mi′) is judged a correct match.
11. The unique-identification method for Eriocheir sinensis based on image matching according to claim 1, characterized in that Step 5 specifically counts the number of correct match pairs and takes the ratio of the number of correct match pairs to the total number of match pairs as the similarity of the two images:
Similarity = (number of correct match pairs) / (total number of match pairs)
CN201810207047.3A 2018-03-14 2018-03-14 A kind of Eriocheir sinensis uniqueness recognition methods based on images match Active CN108509870B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810207047.3A CN108509870B (en) 2018-03-14 2018-03-14 A kind of Eriocheir sinensis uniqueness recognition methods based on images match

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201810207047.3A CN108509870B (en) 2018-03-14 2018-03-14 A kind of Eriocheir sinensis uniqueness recognition methods based on images match

Publications (2)

Publication Number Publication Date
CN108509870A true CN108509870A (en) 2018-09-07
CN108509870B CN108509870B (en) 2019-07-12

Family

ID=63376549

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810207047.3A Active CN108509870B (en) 2018-03-14 2018-03-14 A kind of Eriocheir sinensis uniqueness recognition methods based on images match

Country Status (1)

Country Link
CN (1) CN108509870B (en)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109308716A (en) * 2018-09-20 2019-02-05 珠海市君天电子科技有限公司 A kind of image matching method, device, electronic equipment and storage medium
CN110263789A (en) * 2019-02-18 2019-09-20 北京爱数智慧科技有限公司 A kind of object boundary recognition methods, device and equipment
CN112036280A (en) * 2020-08-24 2020-12-04 方海涛 Waterfowl population dynamic monitoring method, device and equipment
CN112766404A (en) * 2021-01-29 2021-05-07 安徽工大信息技术有限公司 Chinese mitten crab authenticity identification method and system based on deep learning
CN113379720A (en) * 2021-06-29 2021-09-10 云南昆船设计研究院有限公司 Tea cake anti-counterfeiting method based on tea cake image feature code

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2024033290A1 (en) * 2022-08-09 2024-02-15 Lagosta Sa Method and identification device for identification of a shell of a crustacean

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101398937A (en) * 2008-10-29 2009-04-01 北京航空航天大学 Three-dimensional reconstruction method based on fringe photograph collection of same scene
CN101984463A (en) * 2010-11-02 2011-03-09 中兴通讯股份有限公司 Method and device for synthesizing panoramic image
CN102831405A (en) * 2012-08-16 2012-12-19 北京理工大学 Method and system for outdoor large-scale object identification on basis of distributed and brute-force matching
CN103455803A (en) * 2013-09-04 2013-12-18 哈尔滨工业大学 Non-contact type palm print recognition method based on iteration random sampling unification algorithm
CN104933434A (en) * 2015-06-16 2015-09-23 同济大学 Image matching method combining length between perpendiculars (LBP) feature extraction method and surf feature extraction method
CN105741295A (en) * 2016-02-01 2016-07-06 福建师范大学 High-resolution remote sensing image registration method based on local invariant feature point
CN107103317A (en) * 2017-04-12 2017-08-29 湖南源信光电科技股份有限公司 Fuzzy license plate image recognition algorithm based on image co-registration and blind deconvolution
CN107622247A (en) * 2017-09-26 2018-01-23 华东师范大学 A kind of positioning of express waybill and extracting method

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101398937A (en) * 2008-10-29 2009-04-01 北京航空航天大学 Three-dimensional reconstruction method based on fringe photograph collection of same scene
CN101984463A (en) * 2010-11-02 2011-03-09 中兴通讯股份有限公司 Method and device for synthesizing panoramic image
CN102831405A (en) * 2012-08-16 2012-12-19 北京理工大学 Method and system for outdoor large-scale object identification on basis of distributed and brute-force matching
CN103455803A (en) * 2013-09-04 2013-12-18 哈尔滨工业大学 Non-contact type palm print recognition method based on iteration random sampling unification algorithm
CN104933434A (en) * 2015-06-16 2015-09-23 同济大学 Image matching method combining length between perpendiculars (LBP) feature extraction method and surf feature extraction method
CN105741295A (en) * 2016-02-01 2016-07-06 福建师范大学 High-resolution remote sensing image registration method based on local invariant feature point
CN107103317A (en) * 2017-04-12 2017-08-29 湖南源信光电科技股份有限公司 Fuzzy license plate image recognition algorithm based on image co-registration and blind deconvolution
CN107622247A (en) * 2017-09-26 2018-01-23 华东师范大学 A kind of positioning of express waybill and extracting method

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
WU MENG et al.: "Side-scan sonar image registration based on the SURF algorithm", Jiangxi Science *
ZHAO LULU et al.: "Image matching algorithm based on SURF and fast approximate nearest neighbor search", Application Research of Computers *

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109308716A (en) * 2018-09-20 2019-02-05 珠海市君天电子科技有限公司 A kind of image matching method, device, electronic equipment and storage medium
CN110263789A (en) * 2019-02-18 2019-09-20 北京爱数智慧科技有限公司 A kind of object boundary recognition methods, device and equipment
CN112036280A (en) * 2020-08-24 2020-12-04 方海涛 Waterfowl population dynamic monitoring method, device and equipment
CN112766404A (en) * 2021-01-29 2021-05-07 安徽工大信息技术有限公司 Chinese mitten crab authenticity identification method and system based on deep learning
CN113379720A (en) * 2021-06-29 2021-09-10 云南昆船设计研究院有限公司 Tea cake anti-counterfeiting method based on tea cake image feature code
CN113379720B (en) * 2021-06-29 2022-08-09 云南昆船设计研究院有限公司 Tea cake anti-counterfeiting method based on tea cake image feature code

Also Published As

Publication number Publication date
CN108509870B (en) 2019-07-12

Similar Documents

Publication Publication Date Title
CN108509870B (en) A kind of Eriocheir sinensis uniqueness recognition methods based on images match
CN103761799B (en) A kind of bill anti-counterfeit method based on texture image feature and device
CN103345758B (en) Jpeg image region duplication based on DCT statistical nature distorts blind checking method
CN102136058B (en) Bar code image identification method
CN106156684B (en) A kind of two-dimensional code identification method and device
CN104217221A (en) Method for detecting calligraphy and paintings based on textural features
CN104537544A (en) Commodity two-dimensional code anti-fake method and system provided with covering layer and based on background texture feature extraction algorithm
Christlein et al. A study on features for the detection of copy-move forgeries
CN101894260A (en) Method for identifying forgery seal based on feature line randomly generated by matching feature points
CN109858439A (en) A kind of biopsy method and device based on face
CN104182973A (en) Image copying and pasting detection method based on circular description operator CSIFT (Colored scale invariant feature transform)
CN102892048B (en) Video watermark anti-counterfeiting method capable of resisting geometric attacks
Abidin et al. Copy-move image forgery detection using deep learning methods: a review
CN1290047C (en) File anti-fake method and its device based on digital water print
CN106060568A (en) Video tampering detecting and positioning method
CN106815731A (en) A kind of label anti-counterfeit system and method based on SURF Image Feature Matchings
CN109509151A (en) Image and video-splicing method, computer readable storage medium and computer equipment
CN104182882B (en) A kind of product digital watermark anti-fake anti-channel conflict information and its application process
CN113435219B (en) Anti-counterfeiting detection method and device, electronic equipment and storage medium
CN110533704A (en) Fake method, device, equipment and medium are tested in the identification of ink label
CN107134048A (en) A kind of bill anti-counterfeit discrimination method of Intelligent Recognition watermark feature
CN110443306B (en) Authenticity identification method for wine cork
CN107316072A (en) Dimension code anti-counterfeit method, anti-counterfeit authentication method and the false proof device of offline synchronization
Yohannan et al. Detection of copy-move forgery based on Gabor filter
Ali et al. Image forgery localization using image patches and deep learning

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant
CB03 Change of inventor or designer information

Inventor after: Wang Xiaolin

Inventor after: Tai Weipeng

Inventor after: Li Hao

Inventor after: Wang Jie

Inventor after: Zhang Bingliang

Inventor before: Tai Weipeng

Inventor before: Li Hao

Inventor before: Wang Jie

Inventor before: Zhang Bingliang

Inventor before: Wang Xiaolin

CB03 Change of inventor or designer information