CN108509870B - Method for unique identification of Eriocheir sinensis based on image matching - Google Patents
Method for unique identification of Eriocheir sinensis (Chinese mitten crab) based on image matching
- Publication number
- CN108509870B CN108509870B CN201810207047.3A CN201810207047A CN108509870B CN 108509870 B CN108509870 B CN 108509870B CN 201810207047 A CN201810207047 A CN 201810207047A CN 108509870 B CN108509870 B CN 108509870B
- Authority
- CN
- China
- Prior art keywords
- image
- characteristic point
- eriocheir sinensis
- point
- matching
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
- G06V40/10 — Human or animal bodies; body parts
- G06F18/22 — Pattern recognition: matching criteria, e.g. proximity measures
- G06T5/70
- G06T7/11 — Region-based segmentation
- G06T7/13 — Edge detection
- G06T7/155 — Segmentation/edge detection involving morphological operators
- G06T7/194 — Foreground-background segmentation
- G06V10/446 — Local feature extraction using Haar-like filters, e.g. integral image techniques
- G06V10/757 — Matching configurations of points or features
- G06V40/40 — Spoof detection, e.g. liveness detection
- G06T2207/10004 — Still image; photographic image
- G06T2207/20024 — Filtering details
Abstract
The invention discloses a method for unique identification of Eriocheir sinensis (Chinese mitten crab) based on image matching, belonging to the field of digital image processing. The method comprises the following steps. Step 1: acquire an original crab image A, then segment out the back image M of the crab. Step 2: extract the feature points of the back image M. Step 3: retrieve a stored crab back image Q and its feature points from the database, and match the feature points of image Q and image M. Step 4: detect mismatched feature points. Step 5: calculate the similarity between image Q and image M. Step 6: finish matching and output the crab match information. The invention aims to overcome the problem that existing inspection methods cannot effectively verify the uniqueness of an individual crab: by exploiting the various recesses, protrusions and texture features widely distributed over the crab shell, analyzing the image and comparing it with a stored image database, the uniqueness of the crab is judged according to the similarity.
Description
Technical field
The invention belongs to the technical field of image processing and relates to an image matching algorithm, in particular to a method for unique identification of Eriocheir sinensis based on image matching.
Background technique
Eriocheir sinensis, also known as the river crab, hairy crab or Chinese mitten crab, is delicious and nutritious, and is one of China's traditional prized aquatic products. In recent years, the hairy crab consumption market, with Yangcheng Lake crabs as its flagship, has boomed every autumn and winter, and geographical-indication products and trademarks named after famous waters have emerged one after another. Driven by profit, however, counterfeits persist in the market despite repeated bans, and faking the place of origin is especially severe. To prevent counterfeiting, various regions have adopted a range of anti-counterfeiting measures. The common approach on the market is to attach an anti-counterfeiting mark to the crab and then verify the serial number or anti-counterfeiting code on the mark through a published website, telephone number or SMS number. Attachment methods include laser-engraving a random encrypted code on the crab shell, or fastening an anti-counterfeiting tag ("crab button") to a leg. These common methods have the following shortcomings:
1. Logical separation: the anti-counterfeiting mark is logically separate from the crab, i.e., what is verified is not the crab itself but the mark attached to it.
2. Ambiguous conclusion: whether queried by phone, SMS or the Internet, the result can only state whether the mark has been verified before; it cannot establish authenticity. A fake code may be accepted as genuine on its first verification, while the genuine code is then rejected as fake on its second.
3. Easy to copy: the structure of the mark is easy to counterfeit, even allowing 1:N duplication.
4. No traceability: the marks are managed loosely and their serial numbers are not linked to production information, so products cannot be tracked or traced, and a serial number has no defined life cycle.
The current crab anti-counterfeiting and inspection techniques are listed below, together with their principles, advantages and disadvantages.
1. Anti-counterfeiting with a tag ("crab button")
An anti-counterfeiting verification code is printed on the tag; the consumer can query it by dialing a hotline, sending an SMS, or entering the code on the official anti-counterfeiting website.
Advantages: verification is convenient and the tags are inexpensive.
Disadvantages: a crab bearing a tag is not necessarily a genuine hairy crab. Tags are easily copied, and the authenticity of the tag itself is hard to establish. A fake crab may thus be certified as genuine, while a genuine crab fails certification because its code was already used to verify a fake.
2. Anti-counterfeiting with a QR-code label
By scanning the QR code on the label, one judges whether the crab is a genuine Yangcheng Lake or other well-known brand crab.
Advantages: with e-commerce developing rapidly, QR codes are now ubiquitous; many companies use official accounts and company QR codes, and the national 315 product anti-counterfeiting help center has also integrated QR codes into anti-counterfeiting. QR-code labels not only provide verification efficiently but also help companies with publicity.
Disadvantages: the same as for tags; a crab bearing a QR-code label is not necessarily genuine.
3. Laser engraving
A laser brand is engraved on the crab's back with a laser stamp and may contain the trademark, net-enclosure number, product serial number, supervising unit, production area, and so on.
Advantages: the engraving is shallow, does not harm the crab or affect its survival rate, and the engraved text is clearly legible; it is harmless, and the technique enjoys a good reputation at home and abroad.
Disadvantages: laser engraving is also relatively easy to counterfeit; if an unscrupulous trader engraves the same verification code or product serial number on the back of an ordinary crab, consumers still cannot tell them apart.
A search shows that related patent schemes have been published. For example, Chinese patent application 201410068984.7, "An anti-counterfeiting identification method for color images", published 2014.05.07, discloses a method comprising the following steps: raw color image acquisition; processing and storage; acquisition of the color image to be identified; processing and comparison. Through a color password, the method raises the anti-counterfeiting level of color images: without obtaining the original color password, other manufacturers find it difficult to imitate a new color image consistent with the original one, which effectively prevents copying and greatly protects the interests of the original manufacturer, while also allowing consumers to identify the authenticity of a color image. Its drawback, however, is that it is not suited to detecting the biological uniqueness of individuals such as crabs, and thus cannot achieve anti-counterfeiting identification of Eriocheir sinensis.
Summary of the invention
1. Technical problem to be solved by the invention
This patent aims to overcome the inability of existing inspection methods to effectively verify the uniqueness of an individual Eriocheir sinensis. It provides an image-matching-based identification method that exploits the various recesses, protrusions and texture features widely distributed over the crab shell, analyzes the image, compares it with a stored image database, and judges the uniqueness of the crab according to the similarity.
2. Technical solution
To achieve the above objective, the invention provides the following technical solution.
A method for unique identification of Eriocheir sinensis based on image matching, comprising the following steps:
Step 1: acquire an original crab image A, then segment out the back image M;
Step 2: extract the feature points of the back image M;
Step 3: retrieve a stored crab back image Q and its feature points from the database, and match the feature points of image Q and image M;
Step 4: detect mismatched feature points;
Step 5: calculate the similarity of image Q and image M;
Step 6: finish matching and output the crab match information.
As a further improvement of the invention, the back image M is segmented in step 1 as follows:
Step (1): remove the background to obtain the crab/background segmentation image ⑤;
Step (2): process the segmentation image ⑤ to obtain the dilated back image ⑦;
Step (3): finally obtain the back image M from the crab's edge information.
As a further improvement of the invention, step (1) specifically comprises:
Step I: photograph the crab's back to obtain the original image A; the shooting tool is any camera-equipped terminal;
Step II: convert the acquired original image A to a grayscale image ①;
Step III: denoise the grayscale image ① with a Gaussian filter to obtain the filtered image ②;
Step IV: detect the edges of the filtered image ② with the Sobel operator to obtain the edge image ③;
Step V: binarize the edge image ③ by thresholding to obtain the threshold-segmented image ④;
Step VI: invert the gray levels of image ④ to obtain mask m1; then apply dilation, hole filling and erosion to image ④ to obtain mask m2;
Step VII: combine masks m1 and m2 with a logical AND operation to obtain the crab/background segmentation image ⑤.
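Steps IV, V and VII above can be sketched in plain NumPy; the function names and the toy threshold are illustrative assumptions, not the patent's implementation (which also uses Gaussian filtering, hole filling and erosion):

```python
import numpy as np

def sobel_edges(gray, thresh):
    """Steps IV-V: 3x3 Sobel gradient magnitude, then threshold to a binary mask.
    Border pixels are left at 0 for simplicity."""
    kx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)
    ky = kx.T
    h, w = gray.shape
    mag = np.zeros((h, w))
    for i in range(1, h - 1):
        for j in range(1, w - 1):
            patch = gray[i - 1:i + 2, j - 1:j + 2]
            mag[i, j] = np.hypot((patch * kx).sum(), (patch * ky).sum())
    return (mag > thresh).astype(np.uint8)

def and_masks(m1, m2):
    """Step VII: the AND operation that combines the two masks."""
    return m1 & m2
```

On a synthetic image containing a bright square, the thresholded Sobel response fires on the square's border and stays zero in the flat interior, which is the behavior the segmentation pipeline relies on.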
As a further improvement of the invention, step (2) specifically comprises:
Step I: first erode the crab/background segmentation image ⑤, then fill the holes of the eroded image to obtain image ⑥;
Step II: select the back region according to the size of the connected components of image ⑥ to obtain mask m3;
Step III: dilate mask m3 until it matches the size of the back region in the original image A, obtaining the dilated back image ⑦.
As a further improvement of the invention, step (3) is specifically: AND the dilated back image ⑦ with the original image A to obtain the back image M.
As a further improvement of the invention, the feature points of the back image M are extracted in step 2 with the speeded-up robust features (SURF) algorithm, which comprises two processes: extracting feature points and generating feature description vectors.
As a further improvement of the invention, extracting the feature points specifically comprises the following steps:
Step I: compute the integral image, defined as follows.
Let X = (x, y) denote a pixel of image I. The integral image IΣ(X) is the sum of the pixels in the rectangular region spanned by the image origin and the point X = (x, y) as diagonal vertices:
IΣ(X) = Σ_{i=0..x} Σ_{j=0..y} I(i, j)   (1)
Computing the integral image only requires one pass over the original image, so its cost is very small. If a rectangular region is defined by the four vertices A, B, C and D, the sum of its gray values is Σ = A − B − C + D;
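A minimal sketch of formula (1) and the four-corner rectangle sum; the function names are illustrative:

```python
import numpy as np

def integral_image(img):
    """I_sigma(x, y) = sum of all pixels with row <= x and col <= y (formula (1))."""
    return img.cumsum(axis=0).cumsum(axis=1)

def rect_sum(ii, r0, c0, r1, c1):
    """Sum of img[r0:r1+1, c0:c1+1] from four integral-image lookups,
    the 'A - B - C + D' combination of corner values."""
    s = ii[r1, c1]
    if r0 > 0:
        s -= ii[r0 - 1, c1]
    if c0 > 0:
        s -= ii[r1, c0 - 1]
    if r0 > 0 and c0 > 0:
        s += ii[r0 - 1, c0 - 1]
    return s
```

Whatever the rectangle's size, only four lookups are needed, which is why the box filters used later are so cheap to evaluate.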
Step II: build the approximation of the image with the Hessian matrix determinant, defined as follows.
If X = (x, y) is a pixel of image I, the Hessian matrix at point X and scale σ is defined as
H(X, σ) = [ Lxx(X, σ)  Lxy(X, σ) ; Lxy(X, σ)  Lyy(X, σ) ]   (2)
where Lxx(X, σ) is the convolution of image I at point X with the second-order Gaussian derivative ∂²g(σ)/∂x², that is:
Lxx(X, σ) = I(X) * ∂²g(σ)/∂x²   (3)
Step III: Lxy(X, σ) and Lyy(X, σ) are computed analogously, with ∂²g(σ)/∂x∂y and ∂²g(σ)/∂y² respectively   (4)
The feature response is obtained by computing the Hessian determinant at every pixel:
det(H) = Lxx·Lyy − (Lxy)²   (5)
Step IV: denote by Dxx, Dxy and Dyy the box-filter approximations obtained after convolution with the above masks. The Hessian determinant is then approximated by
det(H) = Dxx·Dyy − (0.9·Dxy)²   (6)
This det(H) is the box-filter response in the region around point I(x, y); extreme-point detection is carried out on det(H);
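Formula (6) applied elementwise to box-filter response maps; a sketch with the 0.9 weight stated in the text:

```python
import numpy as np

def approx_hessian_det(dxx, dyy, dxy, w=0.9):
    """det(H) ~= Dxx*Dyy - (w*Dxy)^2; the weight w = 0.9 compensates the
    box-filter approximation of the Gaussian second derivatives."""
    dxx, dyy, dxy = (np.asarray(a, dtype=float) for a in (dxx, dyy, dxy))
    return dxx * dyy - (w * dxy) ** 2
```

Because the inputs are whole response maps, the blob response of every pixel at a given scale is obtained in one vectorized expression.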
Step V: judge the extreme points; the judgment uses the determinant and the eigenvalues of the matrix: if the determinant is positive and the eigenvalues have the same sign, the point is an extreme point;
Step VI: compute the det(H) values of the 26 points in the three-dimensional neighborhood, compare each pixel processed by the Hessian matrix with these 26 values, and retain as preliminary feature points only the extreme points that are larger (or smaller) than all 26 neighborhood values. The 26 points of the three-dimensional neighborhood comprise the 8 neighbors in the pixel's own scale layer plus 9 neighbors in each of the scale layers above and below;
Step VII: select the strongest feature points. Specifically, to obtain sub-pixel feature points, linear interpolation is used: points below a threshold are removed first, and raising the threshold then reduces the number of detected feature points, so that the features finally retained are the strongest.
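Step VI's 26-neighbour comparison across three scale layers can be sketched as follows (the stack layout, with the scale axis first, is an assumption):

```python
import numpy as np

def is_scale_space_extremum(stack, s, i, j):
    """True if the det(H) value at (scale s, row i, col j) is strictly larger
    (or strictly smaller) than all 26 neighbours in its 3x3x3 cube."""
    cube = stack[s - 1:s + 2, i - 1:i + 2, j - 1:j + 2]
    centre = stack[s, i, j]
    neighbours = np.delete(cube.ravel(), 13)  # drop the centre; 26 values remain
    return bool((centre > neighbours).all() or (centre < neighbours).all())
```

A point survives only if it dominates its whole 3x3x3 cube, which is what makes the detected features repeatable across scales.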
As a further improvement of the invention, generating the feature description vectors specifically comprises the following steps:
Step I: for each feature point, compute the Haar wavelet responses in the x and y directions within a circular region of radius 6σ centered on the point (σ is the scale of the feature point). The responses are weighted with a Gaussian of scale 2σ, so that responses closer to the feature point receive larger weights;
Step II: slide a sector window of angle π/3 around the feature point over the circular region and accumulate the Haar wavelet responses of the image inside the window; the direction with the maximum accumulated response is taken as the dominant orientation of the feature point;
Step III: centered on the feature point, first rotate the coordinate axes to the dominant orientation, choose along that orientation a square region of side length 20σ, divide the region into 4x4 sub-regions, and compute the Haar wavelet responses over a 5σ x 5σ range inside each sub-region;
Step IV: let dx and dy denote the horizontal and vertical Haar wavelet responses. First weight dx and dy to enhance robustness, then sum dx, dy, |dx| and |dy| within each sub-region to obtain a four-dimensional vector v = (Σdx, Σdy, Σ|dx|, Σ|dy|); the vectors of all sub-regions together constitute the feature vector of the point, of length 64;
Step V: normalize the feature vector so that it is invariant to rotation, scale and illumination.
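Steps IV-V, the assembly of the 64-d vector from the 4x4 sub-regions and its normalization, can be sketched as below; the sub-region responses are passed in precomputed and the function names are illustrative:

```python
import numpy as np

def subregion_block(dx, dy):
    """One sub-region contributes (sum dx, sum dy, sum |dx|, sum |dy|)."""
    dx, dy = np.asarray(dx, float), np.asarray(dy, float)
    return np.array([dx.sum(), dy.sum(), np.abs(dx).sum(), np.abs(dy).sum()])

def surf_descriptor(dx_subregions, dy_subregions):
    """Concatenate the 4-d blocks of the 16 sub-regions (length 64) and
    L2-normalize the result for illumination invariance (step V)."""
    v = np.concatenate([subregion_block(dx, dy)
                        for dx, dy in zip(dx_subregions, dy_subregions)])
    n = np.linalg.norm(v)
    return v / n if n > 0 else v
```

The signed sums capture the polarity of the local gradient, while the absolute sums capture its strength, which is why both are kept per sub-region.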
As a further improvement of the invention, step 3 performs image matching with a bidirectional fast approximate nearest-neighbor search algorithm, as follows:
Step (1): because the back-image extraction algorithm of the invention loses part of the edge information, the feature points detected by the speeded-up robust features algorithm near the image border suffer some interference. The detected feature points are therefore preprocessed: the segmented crab back image is first binarized, the back edge contour is extracted, and finally the feature points within a certain distance of the contour are deleted;
Step (2): with the fast approximate nearest-neighbor search algorithm, find the match point m2 in image Q of feature point m1 in image M, and record the match pair (m1, m2). Then, with the same method, find the match point m3 in image M of feature point m2 in image Q, and record the match pair (m2, m3);
Step (3): judge the two corresponding points m1 and m3 obtained from the two matches of feature point m2. If m1 and m3 are the same feature point of image M, the match is judged successful; if they are different feature points, the pair is judged a mismatch.
As a further improvement of the invention, mismatched feature points are detected in step 4 by examining, in each of the two images, the position of each successfully matched feature point relative to the nearest-neighbor and second-nearest-neighbor feature points, as follows:
Step (1): let (m1, m2, …, mn) and (m1', m2', …, mn') be the feature points of image M and image Q respectively, where m1 and m1' are a pair of successfully matched feature points, and so on;
Step (2): first sort the pairs successfully matched by the bidirectional fast approximate nearest-neighbor search in ascending order of Euclidean distance, then select the top-ranked pairs as the nearest-neighbor and second-nearest-neighbor match pairs. The Euclidean distance is defined as
d = sqrt( Σ_{k=1..64} (xk − xk')² )   (7)
where (x1, x2, …, x64) and (x1', x2', …, x64') are the feature vectors of the two matched feature points;
Step (3): first take out the nearest-neighbor and second-nearest-neighbor match pairs, denoted (mnear, mnear') and (msubnear, msubnear'), then compute in each image the distance from the nearest-neighbor point to the second-nearest-neighbor point:
D = |mnear − msubnear|   (8)
D' = |mnear' − msubnear'|   (9)
Step (4): successively take out the subsequent pairs (mi, mi') in rank order (where 2 < i ≤ 0.7·n) and compute the distance from each pair to the nearest-neighbor match point:
Di = |mi − mnear|   (10)
Di' = |mi' − mnear'|   (11)
Step (5): for image Q and image M, compute the ratio of the distance from each feature point to the nearest-neighbor match point to the distance between the nearest-neighbor and second-nearest-neighbor match points:
Ri = Di / D,  Ri' = Di' / D'   (12)
Step (6): take the direction from the nearest-neighbor point to the second-nearest-neighbor point in image Q and image M as the positive direction, and compute the angle formed between the positive direction and the line from the nearest-neighbor point to each feature point:
Anglei = ∠(mi − mnear, msubnear − mnear),  Anglei' = ∠(mi' − mnear', msubnear' − mnear')   (13)
where mi_subpoint and mi_subpoint' are the projections of the feature points mi and mi' onto the lines mnear·msubnear and mnear'·msubnear' respectively. The angles Anglei and Anglei' computed here lie in the range [0, 2π];
Step (7): compare the differences between the distance ratios Ri and Ri' and between the angles Anglei and Anglei'; if both differences are less than a threshold, the pair (mi, mi') is judged a correct match.
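A hedged sketch of steps (5)-(7): formulas (12)-(13) were images in the source, so the angle computation via arctan2 below is a reconstruction consistent with the stated [0, 2π] range, and the tolerances are illustrative assumptions:

```python
import numpy as np

def consistent_pair(pts_m, pts_q, near, subnear, i, ratio_tol=0.1, angle_tol=0.1):
    """In each image compute
      R_i   = |m_i - m_near| / |m_near - m_subnear|   (distance ratio)
      Angle = angle between (m_near -> m_i) and the positive direction
              (m_near -> m_subnear), wrapped into [0, 2*pi).
    The pair is kept when both cross-image differences are below threshold."""
    def ratio_angle(pts):
        base = pts[subnear] - pts[near]
        vec = pts[i] - pts[near]
        r = np.linalg.norm(vec) / np.linalg.norm(base)
        ang = (np.arctan2(vec[1], vec[0]) - np.arctan2(base[1], base[0])) % (2 * np.pi)
        return r, ang
    r_m, a_m = ratio_angle(np.asarray(pts_m, float))
    r_q, a_q = ratio_angle(np.asarray(pts_q, float))
    return abs(r_m - r_q) < ratio_tol and abs(a_m - a_q) < angle_tol
```

Because both the ratio and the angle are measured relative to the nearest/second-nearest pair, the test is invariant to translation, rotation and uniform scaling of the whole point set.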
As a further improvement of the invention, step 5 specifically counts the number of correct match pairs and takes the ratio of the number of correct matches to the total number of matches as the similarity of the two images:
similarity = (number of correct matches) / (total number of matches)
If the similarity is greater than some threshold, the two images are judged to show the same crab.
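The decision rule of step 5 in a few lines; the 0.5 threshold is a placeholder assumption, as the patent only says "some threshold":

```python
def similarity(num_correct, num_total):
    """similarity = correct matches / total matches (0.0 when nothing matched)."""
    return num_correct / num_total if num_total else 0.0

def same_crab(num_correct, num_total, threshold=0.5):
    """Judge two back images to show the same crab when similarity > threshold.
    The default 0.5 is illustrative; the patent leaves the value unspecified."""
    return similarity(num_correct, num_total) > threshold
```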
3. Beneficial effects
Compared with the prior art, the technical solution provided by the invention has the following notable effects.
The image-matching-based uniqueness identification method applies the speeded-up robust features algorithm together with the fast approximate nearest-neighbor search algorithm, combined with mismatch detection of feature points, to realize a uniqueness recognition algorithm for Eriocheir sinensis. When the images are affected by rotation, translation, noise and the like, the invention retains good robustness, significantly improves the matching accuracy of crab back images, and guarantees the reliability of uniqueness identification. Simulation experiments show that the method performs well and is convenient to operate, so it has definite practical value. Using the integral image, responses can be computed quickly, which greatly improves computational efficiency; computing the integral image itself requires only one pass over the original image, so its cost is small.
Description of the drawings
Fig. 1 is the flow chart of the image-matching-based method for unique identification of Eriocheir sinensis;
Fig. 2 is the flow chart of extracting the crab back image M;
Fig. 3 shows the effect of extracting feature points from the crab back image with the speeded-up robust features algorithm;
Fig. 4 shows the effect after deleting the feature points within a certain distance of the back image edge;
Fig. 5 shows the matching effect of the back image features of the same crab;
Fig. 6 shows the matching effect of the back image features of two different crabs.
Specific embodiment
To further explain the contents of the invention, it is described in detail below with reference to the drawings and an embodiment.
Embodiment 1
As shown in Fig. 1, Fig. 2, Fig. 3, Fig. 4 and Fig. 5, a method for unique identification of Eriocheir sinensis based on image matching comprises the following steps:
Step 1: then acquisition Eriocheir sinensis original image A is partitioned into the back image M of Eriocheir sinensis;
Step (1): 5. removal background image obtains Eriocheir sinensis background graphics segmentation figure;
Step I: shooting Eriocheir sinensis back original image A, the shooting tool is the camera terminal for having camera;
Step II: 1. collected Eriocheir sinensis original image A is switched into grayscale image;
Step III: 1. noise reduction process being carried out to grayscale image using gaussian filtering, obtains gaussian filtering figure 2.;
Step IV: detecting the edge of gaussian filtering figure 2. using sobel operator, obtain sobel operator detection figure 3.;
Step V: using Threshold segmentation 3. sobel operator detection figure being subjected to binaryzation, obtain Threshold segmentation figure 4.;
Step VI: gray inversion is 4. carried out to Threshold segmentation figure obtains exposure mask figure m1, then 4. Threshold segmentation figure is done expanded,
Holes filling and erosion operation obtain exposure mask figure m2;
Step VII: two figures of m1 and m2 being shipped into calculation, obtain Eriocheir sinensis background graphics segmentation figure 5..
Step (2): process the Eriocheir sinensis background segmentation image ⑤ to obtain the back-region dilation image ⑦;
Step I: first erode the background segmentation image ⑤, then fill the holes in the eroded segmentation image, obtaining image ⑥;
Step II: select the back region according to the size of the connected components of image ⑥, obtaining mask m3;
Step III: dilate mask m3 until it matches the size of the back region in the Eriocheir sinensis original image A, obtaining the back-region dilation image ⑦.
Step (3): perform a logical AND of the back-region dilation image ⑦ with the original image A to obtain the back image M of the Eriocheir sinensis.
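The segmentation chain of Step 1 (grayscale, Gaussian noise reduction, Sobel edges, threshold binarization, mask AND) can be sketched as follows. This is a minimal NumPy-only illustration, not the patented implementation: the 3x3 kernels, zero padding, and the threshold value are illustrative assumptions, and the morphological mask operations (dilation, hole filling, erosion) are omitted.

```python
import numpy as np

def conv3x3(img, k):
    """Correlate a 2-D image with a 3x3 kernel (zero padding)."""
    p = np.pad(img.astype(float), 1)
    out = np.zeros(img.shape)
    for di in range(3):
        for dj in range(3):
            out += k[di, dj] * p[di:di + img.shape[0], dj:dj + img.shape[1]]
    return out

GAUSS = np.array([[1, 2, 1], [2, 4, 2], [1, 2, 1]]) / 16.0   # noise reduction
SOBEL_X = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]])      # edge detection
SOBEL_Y = SOBEL_X.T

def segment_back(gray, thresh):
    """Grayscale -> Gaussian filter -> Sobel edges -> threshold -> mask AND."""
    smoothed = conv3x3(gray, GAUSS)
    edges = np.hypot(conv3x3(smoothed, SOBEL_X), conv3x3(smoothed, SOBEL_Y))
    mask = (edges > thresh).astype(float)   # threshold segmentation
    return gray * mask                      # AND of mask and original image

gray = np.zeros((8, 8))
gray[1:7, 1:7] = 100.0                      # toy stand-in for the crab back
back = segment_back(gray, thresh=50.0)
```

On the toy image, the flat interior of the square and the background are masked out while the edge response survives, which is the information the later mask operations would close into a solid back region.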
Step 2: extract the feature points of the Eriocheir sinensis back image M. The method used to extract the feature points of the back image M is the speeded-up robust features (SURF) algorithm, which comprises two processes: extracting feature points and generating feature description vectors.
The feature point extraction specifically includes the following steps:
Step I: compute the integral image, which is defined as follows:
if X = (x, y) denotes a pixel of image I, the integral image IΣ(X) is the sum of the pixels in the rectangular region whose diagonal corners are the image origin and the point X = (x, y), expressed as:
IΣ(X) = Σ(i ≤ x) Σ(j ≤ y) I(i, j)
Computing the integral image requires only a single pass over the original image, so the computational cost is very small. If a rectangular region is bounded by the four integral-image samples at vertices A, B, C and D, the sum of gray values inside the region is Σ = A − B − C + D;
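The integral image and the four-corner rectangle sum above can be sketched as follows (a NumPy illustration; the corner bookkeeping follows the standard summed-area-table identity rather than the patent's A − B − C + D vertex labeling):

```python
import numpy as np

def integral_image(img):
    """I_Sigma(X): sum of all pixels in the rectangle spanned by the image
    origin and the point X = (x, y)."""
    return img.cumsum(axis=0).cumsum(axis=1)

def rect_sum(ii, r0, c0, r1, c1):
    """Sum of img[r0:r1+1, c0:c1+1] from at most four integral-image lookups."""
    total = ii[r1, c1]
    if r0 > 0:
        total -= ii[r0 - 1, c1]
    if c0 > 0:
        total -= ii[r1, c0 - 1]
    if r0 > 0 and c0 > 0:
        total += ii[r0 - 1, c0 - 1]
    return total

img = np.arange(25.0).reshape(5, 5)   # toy 5x5 image
ii = integral_image(img)              # one pass over the image
```

`rect_sum` reads only four entries of `ii` regardless of the rectangle size, which is what makes the box-filter responses of the following steps cheap to evaluate.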
Step II: approximate the image with the Hessian matrix determinant, which is defined as follows:
if X = (x, y) is a pixel of image I, the Hessian matrix H(X, σ) at point X and scale σ is defined as:
H(X, σ) = [ Lxx(X, σ)  Lxy(X, σ) ; Lxy(X, σ)  Lyy(X, σ) ]
where Lxx(X, σ) is the convolution of the second-order Gaussian derivative ∂²g(σ)/∂x² with the image I at point X; Lxy(X, σ) and Lyy(X, σ) are computed analogously.
Step III: by computing the Hessian matrix determinant of each pixel, the feature point response is obtained; the determinant of the Hessian matrix is:
det(H) = Lxx·Lyy − (Lxy)²    (5)
Step IV: let Dxx, Dxy and Dyy be the Hessian matrix parameters obtained after convolution with the corresponding box-filter masks; the determinant of the Hessian matrix is then approximated as:
det(H) = Dxx·Dyy − (0.9·Dxy)²    (6)
The value det(H) is the box-filter response in the region around the point I(x, y), and extreme-point detection is carried out on det(H);
Step V: judge the extreme points; the judgment is made from the determinant and the eigenvalues of the matrix: if the determinant is positive and the eigenvalues have the same sign, the point is an extreme point.
To detect feature points at different scales, a scale space of the image must be built. In the SURF algorithm the image size remains unchanged while the size of the box filter is varied, and the integral image allows the responses to be computed quickly, which greatly improves efficiency;
Step VI: compute the det(H) values of the 26 points in the three-dimensional neighborhood of each pixel processed by the Hessian matrix, compare the pixel with those 26 values, and finally retain as preliminary feature points only the extreme points that are larger (or smaller) than all 26 neighborhood values. The 26 points of the three-dimensional neighborhood comprise the 8 neighbors in the pixel's own scale layer plus the 9 neighbors in each of the scale layers above and below it;
Step VII: select the strongest feature points. Specifically, to obtain feature points with sub-pixel accuracy, linear interpolation is used: the points below the threshold are removed first, and raising the threshold then reduces the number of detected feature points, so that the feature points that finally remain are the strongest; in this embodiment the threshold is set to 1500.
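Steps II-VII above can be sketched as follows. This is a simplified stand-in, not SURF itself: finite differences replace the box filters of formula (6), repeated 3x3 smoothing replaces the growing filter sizes, the scale-normalization factor t² and the threshold value are illustrative assumptions, and the sub-pixel interpolation of Step VII is omitted. The 26-neighbor comparison of Step VI is implemented literally.

```python
import numpy as np

def smooth(img):
    """One pass of a 3x3 binomial filter (adds ~0.5 to the Gaussian variance)."""
    k = np.array([[1.0, 2.0, 1.0], [2.0, 4.0, 2.0], [1.0, 2.0, 1.0]]) / 16.0
    p = np.pad(img, 1)
    out = np.zeros_like(img)
    for di in range(3):
        for dj in range(3):
            out += k[di, dj] * p[di:di + img.shape[0], dj:dj + img.shape[1]]
    return out

def hessian_det(img):
    """det(H) = Lxx*Lyy - (0.9*Lxy)^2, with central differences standing in
    for the box-filter responses of formula (6)."""
    Lxx = np.zeros_like(img)
    Lyy = np.zeros_like(img)
    Lxy = np.zeros_like(img)
    Lxx[:, 1:-1] = img[:, 2:] - 2.0 * img[:, 1:-1] + img[:, :-2]
    Lyy[1:-1, :] = img[2:, :] - 2.0 * img[1:-1, :] + img[:-2, :]
    Lxy[1:-1, 1:-1] = (img[2:, 2:] - img[2:, :-2]
                       - img[:-2, 2:] + img[:-2, :-2]) / 4.0
    return Lxx * Lyy - (0.9 * Lxy) ** 2

def detect(img, n_scales=7, thresh=50.0):
    """Keep points whose scale-normalized det(H) exceeds thresh and is the
    maximum over the 26 neighbours: 8 in the same layer, 9 above, 9 below."""
    stack, cur = [], img.astype(float)
    for k in range(1, n_scales + 1):
        cur = smooth(cur)
        t = 0.5 * k                              # accumulated smoothing variance
        stack.append(t ** 2 * hessian_det(cur))  # scale normalization
    d = np.stack(stack)                          # (scale, row, col)
    pts = []
    for s in range(1, n_scales - 1):
        for r in range(1, img.shape[0] - 1):
            for c in range(1, img.shape[1] - 1):
                nb = d[s - 1:s + 2, r - 1:r + 2, c - 1:c + 2]
                if d[s, r, c] > thresh and d[s, r, c] == nb.max():
                    pts.append((s, r, c, d[s, r, c]))
    return pts

# a bright Gaussian blob: its centre should come out as the strongest point
rr, cc = np.mgrid[0:15, 0:15]
blob = 100.0 * np.exp(-((rr - 7) ** 2 + (cc - 7) ** 2) / 4.0)
pts = detect(blob)
```

Run on the synthetic blob, the strongest surviving extreme point lands on the blob centre at an interior scale layer, illustrating why the determinant response must be compared across all 26 three-dimensional neighbors rather than only within one scale layer.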
As a further improvement of the present invention, the generation of feature description vectors specifically includes the following steps:
Step I: to ensure that the feature point descriptors are rotation invariant, a dominant orientation must be assigned to each feature point. The Haar wavelet responses of the feature point in the x and y directions are computed within a circular region of radius 6σ centered on each feature point (σ is the scale of the feature point); the Haar wavelet responses are Gaussian-weighted with a Gaussian weighting function of scale 2σ, so that responses closer to the feature point receive larger weights;
Step II: centered on the feature point, a sector window of angle π/3 is slid around the circular region, and the Haar wavelet responses of the image inside the sliding window are accumulated; the direction in which the accumulated Haar response is largest is taken as the dominant orientation of the feature point;
Step III: centered on the feature point, the coordinate axes are first rotated to the dominant orientation; a square region of side length 20σ aligned with the dominant orientation is chosen and divided into 4x4 subregions, and the Haar wavelet responses over a 5σ x 5σ range are computed in each subregion;
Step IV: let dx and dy denote the Haar wavelet responses in the horizontal and vertical directions. To enhance robustness, dx and dy are first weighted; then, in each subregion, the sums Σdx, Σdy, Σ|dx| and Σ|dy| form a four-dimensional vector υ = (Σdx, Σdy, Σ|dx|, Σ|dy|); the vectors of all subregions together constitute the feature vector of the point, of length 64;
Step V: the feature vector is normalized, giving it invariance to rotation, scale and illumination.
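Steps III-V of the descriptor (4x4 subregions, four sums per subregion, final normalization) can be sketched as follows, assuming the Haar responses dx and dy have already been sampled on a 20x20 grid around the feature point; the orientation assignment and Gaussian weighting of Steps I-II are omitted, and the random inputs are illustrative stand-ins:

```python
import numpy as np

def surf_descriptor(dx, dy):
    """Build the 64-D descriptor from per-pixel Haar responses.

    dx, dy: (20, 20) arrays of horizontal / vertical Haar wavelet responses.
    Each of the 4x4 subregions (5x5 samples each) contributes the quadruple
    (sum dx, sum dy, sum |dx|, sum |dy|)."""
    feats = []
    for i in range(4):
        for j in range(4):
            sx = dx[5 * i:5 * i + 5, 5 * j:5 * j + 5]
            sy = dy[5 * i:5 * i + 5, 5 * j:5 * j + 5]
            feats += [sx.sum(), sy.sum(), np.abs(sx).sum(), np.abs(sy).sum()]
    v = np.array(feats)                  # 16 subregions x 4 sums = 64 values
    n = np.linalg.norm(v)
    return v / n if n > 0 else v         # Step V: normalization

rng = np.random.default_rng(0)
dx0 = rng.normal(size=(20, 20))          # stand-in Haar responses
dy0 = rng.normal(size=(20, 20))
desc = surf_descriptor(dx0, dy0)
```

Because every component is linear in the response magnitude, the final normalization makes the 64-D vector invariant to a global contrast scaling, which is the illumination invariance Step V intends.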
Step 3: extract the saved Eriocheir sinensis back image Q and its feature points from the database, and perform image matching with a bidirectional fast approximate nearest-neighbor search algorithm to match the feature points of image Q and image M; the specific steps are as follows:
Step (1): since the back-image extraction algorithm of the present invention loses part of the edge information, the feature points detected by the SURF algorithm near the image edge are subject to interference. The detected feature points are therefore pre-processed: the segmented Eriocheir sinensis back image is first binarized, the back edge contour is then extracted, and finally the feature points within a certain distance of the back edge contour are deleted;
Step (2): the fast approximate nearest-neighbor search algorithm is used to find, for a feature point m1 in image M, its match point m2 in image Q, and the match pair is denoted (m1, m2); the same method is then used to find, for the feature point m2 in image Q, its match point m3 in image M, and the match pair is denoted (m2, m3);
Step (3): the two corresponding points m1 and m3 obtained by matching feature point m2 twice are compared. If m1 and m3 are the same feature point in image M, the match is judged successful; if m1 and m3 are different feature points in image M, the pair is judged a mismatch.
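The bidirectional check of steps (2)-(3) can be sketched with a brute-force nearest-neighbor search standing in for the fast approximate search of the text; the toy descriptors below are illustrative:

```python
import numpy as np

def nearest(desc, db):
    """Index of the database descriptor closest (Euclidean) to `desc`."""
    return int(np.argmin(np.linalg.norm(db - desc, axis=1)))

def cross_match(A, B):
    """Keep (i, j) only when A[i] -> B[j] and B[j] -> A[i] agree.

    Brute-force stand-in for the bidirectional fast approximate
    nearest-neighbor search of Step 3."""
    matches = []
    for i in range(len(A)):
        j = nearest(A[i], B)          # forward search M -> Q
        if nearest(B[j], A) == i:     # backward search Q -> M must return i
            matches.append((i, j))
    return matches

A = np.array([[0.0, 0.0], [10.0, 0.0], [5.0, 5.0]])
B = np.array([[10.1, 0.0], [0.1, 0.0]])   # permuted, slightly perturbed copies
m = cross_match(A, B)
```

The one-sided match of A[2] is discarded because the backward search returns a different point, which is exactly the mismatch case judged in step (3).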
Step 4: detect mismatched feature points. The mismatch detection uses the positions of the successfully matched feature points in the two images relative to their nearest-neighbor and second-nearest-neighbor feature points, as follows:
Step (1): let (m1, m2, …, mn) and (m1′, m2′, …, mn′) be the feature points in image I1 and image I2 respectively, where m1 and m1′ are a pair of successfully matched feature points, and so on;
Step (2): the match pairs produced by the bidirectional fast approximate nearest-neighbor search algorithm are first sorted by Euclidean distance in ascending order, and then the top-ranked match pairs, namely the nearest-neighbor and second-nearest-neighbor match pairs, are selected; the Euclidean distance is defined as follows:
d = sqrt((x1 − x1′)² + (x2 − x2′)² + … + (x64 − x64′)²)    (7)
where (x1, x2, x3, …, x64) and (x1′, x2′, x3′, …, x64′) are the feature vectors of the two successfully matched feature points;
Step (3): the nearest-neighbor and second-nearest-neighbor match pairs are taken out first and denoted (mnear, mnear′) and (msubnear, msubnear′); the distance from the nearest neighbor to the second-nearest neighbor is then computed in each of the two images as follows:
D = |mnear − msubnear|    (8)
D′ = |mnear′ − msubnear′|    (9)
Step (4): the lower-ranked match pairs (mi, mi′) (where 2 < i ≤ 0.7·n) are taken out in order, and the distance from each pair to the nearest-neighbor match point is computed as follows:
Di = |mi − mnear|    (10)
Di′ = |mi′ − mnear′|    (11)
Step (5): for each feature point in image Q and image M, the ratio of its distance to the nearest-neighbor match point over the distance between the nearest-neighbor and second-nearest-neighbor match points is computed as follows:
Ri = Di / D,  Ri′ = Di′ / D′    (12)
Step (6): in image Q and image M, the direction from the nearest neighbor to its second-nearest neighbor is taken as the positive direction, denoted mnear→msubnear and mnear′→msubnear′ respectively; the angle formed between the vector from the nearest-neighbor point to each feature point and the positive direction is then computed, the projection points being used to extend the angle to the full circle,
where mi_subpoint and mi_subpoint′ are the projections of the feature points mi and mi′ onto the lines mnearmsubnear and mnear′msubnear′ respectively; the angles Anglei and Anglei′ computed here lie in the range [0, 2π];
Step (7): compare the differences of the distance ratios Ri and Ri′ and of the angles Anglei and Anglei′; if both differences are below their thresholds, the match pair (mi, mi′) is judged a correct match. In this embodiment the distance-ratio threshold is 0.5 and the angle threshold is 0.2.
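Steps (3)-(7) can be sketched on point coordinates as follows. This is a simplified reading, not the patent's exact formulas: the angle is computed directly with atan2 rather than through the projection points mi_subpoint, and the angle difference is compared without wrap-around handling; the thresholds follow the embodiment (0.5 and 0.2):

```python
import numpy as np

def polar_wrt(base, ref, p):
    """Distance ratio and angle of p seen from `base`, with base -> ref as
    the positive direction; the angle lies in [0, 2*pi)."""
    v = p - base
    u = ref - base
    r = np.linalg.norm(v) / np.linalg.norm(u)
    ang = (np.arctan2(v[1], v[0]) - np.arctan2(u[1], u[0])) % (2 * np.pi)
    return r, ang

def filter_matches(P, Q, near=0, subnear=1, r_tol=0.5, a_tol=0.2):
    """P[i] matches Q[i]; keep pairs whose ratio/angle agree across images."""
    kept = []
    for i in range(2, len(P)):
        r1, a1 = polar_wrt(P[near], P[subnear], P[i])
        r2, a2 = polar_wrt(Q[near], Q[subnear], Q[i])
        if abs(r1 - r2) < r_tol and abs(a1 - a2) < a_tol:
            kept.append(i)
    return kept

# toy data: Q is P rotated 90 degrees and shifted; last pair is a mismatch
theta = np.pi / 2
R = np.array([[np.cos(theta), -np.sin(theta)], [np.sin(theta), np.cos(theta)]])
P = np.array([[0.0, 0.0], [4.0, 0.0], [2.0, 2.0], [1.0, 3.0]])
Q = P @ R.T + np.array([10.0, 10.0])
Q[3] = [0.0, 0.0]                     # corrupt the last correspondence
good = filter_matches(P, Q)
```

Because distance ratios and relative angles are preserved by rotation and translation, the correct pair agrees across the two images while the corrupted pair is rejected. A production version would also compare angles modulo 2π to handle wrap-around near 0.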
Step 5: compute the similarity of image Q and image M. Specifically, the number of correct match pairs is counted, and the ratio of the number of correct match pairs to the total number of match pairs is taken as the similarity of the two images, with the calculation formula:
similarity = (number of correct match pairs) / (total number of match pairs)
If the similarity is greater than the threshold, the two images are judged to be the same crab; in this embodiment the threshold is set to 0.8;
Step 6: matching ends, and the crab match information is output.
This embodiment applies the SURF algorithm and the fast approximate nearest-neighbor search algorithm, combined with the mismatch detection of feature points, to realize a unique identification algorithm for Eriocheir sinensis. When the images are affected by rotation, translation, noise and the like, the present invention shows good robustness, significantly improves the accuracy of Eriocheir sinensis image matching, and guarantees reliable unique identification of Eriocheir sinensis; in addition, simulation experiments show that the method performs well, is easy to operate, and has definite practical value.
Embodiment 2
As shown in Fig. 1, Fig. 2, Fig. 3, Fig. 4 and Fig. 5, this embodiment is substantially the same as Embodiment 1. Preferably, in this embodiment the total number of match pairs in Fig. 5 is 57 and the number of correct match pairs is 52, giving a similarity of 91.2281%, so the two Eriocheir sinensis can be judged to be the same crab.
Embodiment 3
As shown in Fig. 1, Fig. 2, Fig. 3, Fig. 4 and Fig. 6, this embodiment is substantially the same as Embodiment 1. Preferably, in this embodiment the total number of match pairs in Fig. 6 is 20 and the number of correct match pairs is 0, giving a similarity of 0, so the two Eriocheir sinensis in Fig. 6 can be judged not to be the same crab.
The present invention and its embodiments have been described above schematically, and the description is not restrictive; what is shown in the drawings is only one embodiment of the present invention, and the actual structure is not limited thereto. Therefore, if a person of ordinary skill in the art, enlightened by it and without departing from the spirit of the invention, devises, without inventive effort, frame modes and embodiments similar to this technical solution, they shall fall within the scope of protection of the present invention.
Claims (8)
1. An Eriocheir sinensis unique identification method based on image matching, characterized by comprising the following steps:
Step 1: acquire the Eriocheir sinensis original image A, then segment out the back image M of the Eriocheir sinensis;
Step (1): remove the background to obtain the Eriocheir sinensis background segmentation image;
Step (2): process the Eriocheir sinensis background segmentation image to obtain the back-region dilation image;
Step (3): finally obtain the back image M of the Eriocheir sinensis according to its edge information;
Step 2: extract the feature points of the Eriocheir sinensis back image M; the method used to extract the feature points of the back image M is the speeded-up robust features (SURF) algorithm, which comprises two processes: extracting feature points and generating feature description vectors;
Step 3: extract the saved Eriocheir sinensis back image Q and its feature points from the database, and match the feature points of image Q and image M;
Step 4: detect mismatched feature points; the specific method uses the positions of the successfully matched feature points in the two images relative to their nearest-neighbor and second-nearest-neighbor feature points, as follows:
Step (1): let (m1, m2, …, mn) and (m1′, m2′, …, mn′) be the feature points in image I1 and image I2 respectively, where m1 and m1′ are a pair of successfully matched feature points, and so on;
Step (2): the match pairs produced by the bidirectional fast approximate nearest-neighbor search algorithm are first sorted by Euclidean distance in ascending order, and then the top-ranked match pairs, namely the nearest-neighbor and second-nearest-neighbor match pairs, are selected; the Euclidean distance is defined as follows:
d = sqrt((x1 − x1′)² + (x2 − x2′)² + … + (x64 − x64′)²)    (7)
where (x1, x2, x3, …, x64) and (x1′, x2′, x3′, …, x64′) are the feature vectors of the two successfully matched feature points;
Step (3): the nearest-neighbor and second-nearest-neighbor match pairs are taken out first and denoted (mnear, mnear′) and (msubnear, msubnear′); the distance from the nearest neighbor to the second-nearest neighbor is then computed in each of the two images as follows:
D = |mnear − msubnear|    (8)
D′ = |mnear′ − msubnear′|    (9)
Step (4): the lower-ranked match pairs (mi, mi′) (where 2 < i ≤ 0.7·n) are taken out in order, and the distance from each pair to the nearest-neighbor match point is computed as follows:
Di = |mi − mnear|    (10)
Di′ = |mi′ − mnear′|    (11)
Step (5): for each feature point in image Q and image M, the ratio of its distance to the nearest-neighbor match point over the distance between the nearest-neighbor and second-nearest-neighbor match points is computed as follows:
Ri = Di / D,  Ri′ = Di′ / D′    (12)
Step (6): in image Q and image M, the direction from the nearest neighbor to its second-nearest neighbor is taken as the positive direction, denoted mnear→msubnear and mnear′→msubnear′ respectively; the angle formed between the vector from the nearest-neighbor point to each feature point and the positive direction is then computed,
where mi_subpoint and mi_subpoint′ are the projections of the feature points mi and mi′ onto the lines mnearmsubnear and mnear′msubnear′ respectively, and the angles Anglei and Anglei′ computed here lie in the range [0, 2π];
Step (7): compare the differences of the distance ratios Ri and Ri′ and of the angles Anglei and Anglei′; if the differences are below the thresholds, the match pair (mi, mi′) is judged a correct match;
Step 5: compute the similarity of image Q and image M;
Step 6: matching ends, and the crab match information is output.
2. The Eriocheir sinensis unique identification method based on image matching according to claim 1, characterized in that the specific steps of step (1) are as follows:
Step I: photograph the back of the Eriocheir sinensis to obtain original image A;
Step II: convert the collected Eriocheir sinensis original image A into a grayscale image;
Step III: apply Gaussian filtering to the grayscale image for noise reduction, obtaining the Gaussian-filtered image;
Step IV: detect the edges of the Gaussian-filtered image with the Sobel operator, obtaining the Sobel detection image;
Step V: binarize the Sobel detection image by threshold segmentation, obtaining the threshold segmentation image;
Step VI: invert the gray levels of the threshold segmentation image to obtain mask m1, then apply dilation, hole filling and erosion to the threshold segmentation image to obtain mask m2;
Step VII: perform a logical AND of m1 and m2 to obtain the Eriocheir sinensis background segmentation image.
3. The Eriocheir sinensis unique identification method based on image matching according to claim 1 or 2, characterized in that the specific processing steps of step (2) are as follows:
Step I: first erode the Eriocheir sinensis background segmentation image, then fill the holes in the eroded segmentation image, obtaining an image;
Step II: select the back region according to the size of the connected components of the image, obtaining mask m3;
Step III: dilate mask m3 until it matches the size of the back region in the Eriocheir sinensis original image A, obtaining the back-region dilation image.
4. The Eriocheir sinensis unique identification method based on image matching according to claim 3, characterized in that step (3) specifically comprises: performing a logical AND of the back-region dilation image with the original image A to obtain the back image M of the Eriocheir sinensis.
5. The Eriocheir sinensis unique identification method based on image matching according to claim 1, characterized in that the feature point extraction specifically includes the following steps:
Step I: compute the integral image, which is defined as follows:
if X = (x, y) denotes a pixel of image I, the integral image IΣ(X) is the sum of the pixels in the rectangular region whose diagonal corners are the image origin and the point X = (x, y), expressed as:
IΣ(X) = Σ(i ≤ x) Σ(j ≤ y) I(i, j)
if a rectangular region is bounded by the four integral-image samples at vertices A, B, C and D, the sum of gray values inside the region is Σ = A − B − C + D;
Step II: approximate the image with the Hessian matrix determinant, which is defined as follows:
if X = (x, y) is a pixel of image I, the Hessian matrix H(X, σ) at point X and scale σ is defined as:
H(X, σ) = [ Lxx(X, σ)  Lxy(X, σ) ; Lxy(X, σ)  Lyy(X, σ) ]
where Lxx(X, σ) is the convolution of the second-order Gaussian derivative ∂²g(σ)/∂x² with the image I at point X;
Step III: compute the Hessian matrix determinant of each pixel; the determinant of the Hessian matrix of each pixel is:
det(H) = Lxx·Lyy − (Lxy)²    (5)
Step IV: let the Hessian matrix parameters obtained be Dxx, Dxy and Dyy; the determinant formed from these parameters is then approximated as:
det(H) = Dxx·Dyy − (0.9·Dxy)²    (6)
The value det(H) is the box-filter response in the region around the point I(x, y);
Step V: judge the extreme points; the specific judgment method is: if the determinant is positive and the eigenvalues have the same sign, the point is an extreme point;
Step VI: compute the det(H) values of the 26 points in the three-dimensional neighborhood of each pixel processed by the Hessian matrix, compare the pixel with those 26 values, and finally retain as preliminary feature points only the extreme points that are larger (or smaller) than all 26 neighborhood values; the 26 points of the three-dimensional neighborhood comprise the 8 neighbors in the pixel's own scale layer plus the 9 neighbors in each of the scale layers above and below it;
Step VII: select the strongest feature points; the specific method is to use linear interpolation to first remove the points below the threshold and then raise the threshold, so that the feature points that finally remain are the strongest.
6. The Eriocheir sinensis unique identification method based on image matching according to claim 5, characterized in that the generation of feature description vectors specifically includes the following steps:
Step I: compute the Haar wavelet responses of the feature point in the x and y directions within a circular region of radius 6σ centered on each feature point (σ is the scale of the feature point); the Haar wavelet responses are Gaussian-weighted with a Gaussian weighting function of scale 2σ, so that responses closer to the feature point receive larger weights;
Step II: centered on the feature point, a sector window of angle π/3 is slid around the circular region, and the Haar wavelet responses of the image inside the sliding window are accumulated; the direction in which the accumulated Haar response is largest is the dominant orientation of the feature point;
Step III: centered on the feature point, the coordinate axes are first rotated to the dominant orientation; a square region of side length 20σ aligned with the dominant orientation is chosen and divided into 4x4 subregions, and the Haar wavelet responses over a 5σ x 5σ range are computed in each subregion;
Step IV: let dx and dy denote the Haar wavelet responses in the horizontal and vertical directions; dx and dy are first weighted, and then the sums Σdx, Σdy, Σ|dx| and Σ|dy| in each subregion form a four-dimensional vector υ = (Σdx, Σdy, Σ|dx|, Σ|dy|); the vectors of the subregions constitute the feature vector of the point;
Step V: the feature vector is normalized.
7. The Eriocheir sinensis unique identification method based on image matching according to claim 1, characterized in that step 3 performs image matching using a bidirectional fast approximate nearest-neighbor search algorithm, with the following specific steps:
Step (1): pre-process the detected feature points; specifically, first binarize the segmented Eriocheir sinensis back image, then extract the back edge contour, and finally delete the feature points within a certain distance of the back edge contour;
Step (2): use the fast approximate nearest-neighbor search algorithm to find, for a feature point m1 in image M, its match point m2 in image Q, the match pair being denoted (m1, m2); then use the same method to find, for the feature point m2 in image Q, its match point m3 in image M, the match pair being denoted (m2, m3);
Step (3): the two corresponding points m1 and m3 obtained by matching feature point m2 twice are compared; if m1 and m3 are the same feature point in image M, the match is judged successful.
8. The Eriocheir sinensis unique identification method based on image matching according to claim 1, characterized in that step 5 specifically counts the number of correct match pairs and takes the ratio of the number of correct match pairs to the total number of match pairs as the similarity of the two images, the calculation formula of the similarity being:
similarity = (number of correct match pairs) / (total number of match pairs).
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201810207047.3A CN108509870B (en) | 2018-03-14 | 2018-03-14 | A kind of Eriocheir sinensis uniqueness recognition methods based on images match |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201810207047.3A CN108509870B (en) | 2018-03-14 | 2018-03-14 | A kind of Eriocheir sinensis uniqueness recognition methods based on images match |
Publications (2)
Publication Number | Publication Date |
---|---|
CN108509870A CN108509870A (en) | 2018-09-07 |
CN108509870B true CN108509870B (en) | 2019-07-12 |
Family
ID=63376549
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201810207047.3A Active CN108509870B (en) | 2018-03-14 | 2018-03-14 | A kind of Eriocheir sinensis uniqueness recognition methods based on images match |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN108509870B (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2024033290A1 (en) * | 2022-08-09 | 2024-02-15 | Lagosta Sa | Method and identification device for identification of a shell of a crustacean |
Families Citing this family (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109308716A (en) * | 2018-09-20 | 2019-02-05 | 珠海市君天电子科技有限公司 | A kind of image matching method, device, electronic equipment and storage medium |
CN109871846A (en) * | 2019-02-18 | 2019-06-11 | 北京爱数智慧科技有限公司 | A kind of object boundary recognition methods, device and equipment |
CN112036280A (en) * | 2020-08-24 | 2020-12-04 | 方海涛 | Waterfowl population dynamic monitoring method, device and equipment |
CN112766404A (en) * | 2021-01-29 | 2021-05-07 | 安徽工大信息技术有限公司 | Chinese mitten crab authenticity identification method and system based on deep learning |
CN113379720B (en) * | 2021-06-29 | 2022-08-09 | 云南昆船设计研究院有限公司 | Tea cake anti-counterfeiting method based on tea cake image feature code |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101398937A (en) * | 2008-10-29 | 2009-04-01 | 北京航空航天大学 | Three-dimensional reconstruction method based on fringe photograph collection of same scene |
CN101984463A (en) * | 2010-11-02 | 2011-03-09 | 中兴通讯股份有限公司 | Method and device for synthesizing panoramic image |
CN103455803A (en) * | 2013-09-04 | 2013-12-18 | 哈尔滨工业大学 | Non-contact type palm print recognition method based on iteration random sampling unification algorithm |
CN104933434A (en) * | 2015-06-16 | 2015-09-23 | 同济大学 | Image matching method combining length between perpendiculars (LBP) feature extraction method and surf feature extraction method |
CN105741295A (en) * | 2016-02-01 | 2016-07-06 | 福建师范大学 | High-resolution remote sensing image registration method based on local invariant feature point |
CN107103317A (en) * | 2017-04-12 | 2017-08-29 | 湖南源信光电科技股份有限公司 | Fuzzy license plate image recognition algorithm based on image co-registration and blind deconvolution |
Family Cites Families (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102831405B (en) * | 2012-08-16 | 2014-11-26 | 北京理工大学 | Method and system for outdoor large-scale object identification on basis of distributed and brute-force matching |
CN107622247B (en) * | 2017-09-26 | 2020-08-25 | 华东师范大学 | Express waybill positioning and extracting method |
Non-Patent Citations (2)
Title |
---|
Image Matching Algorithm Based on SURF and Fast Approximate Nearest Neighbor Search; Zhao Lulu et al.; Application Research of Computers; 2013-03-31; Vol. 30, No. 3; pp. 921-923
Side-Scan Sonar Image Registration Based on the SURF Algorithm; Wu Meng et al.; Jiangxi Science; 2017-12-31; Vol. 35, No. 6; pp. 897-901, 912
Also Published As
Publication number | Publication date |
---|---|
CN108509870A (en) | 2018-09-07 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN108509870B (en) | A kind of Eriocheir sinensis uniqueness recognition methods based on images match | |
CN102136058B (en) | Bar code image identification method | |
CN103761799B (en) | A kind of bill anti-counterfeit method based on texture image feature and device | |
CN106485183B (en) | A kind of Quick Response Code localization method and system | |
CN103914680B (en) | A kind of spray printing character picture identification and check system and method | |
AU2008335636B2 (en) | Identification and verification of an unknown document according to an Eigen image process | |
CN104537544A (en) | Commodity two-dimensional code anti-fake method and system provided with covering layer and based on background texture feature extraction algorithm | |
CN104217221A (en) | Method for detecting calligraphy and paintings based on textural features | |
CN110766594B (en) | Information hiding method and device, detection method and device and anti-counterfeiting tracing method | |
Christlein et al. | A study on features for the detection of copy-move forgeries | |
CN103345758A (en) | Joint photographic experts group (JPEG) image region copying and tampering blind detection method based on discrete cosine transformation (DCT) statistical features | |
CN104182973A (en) | Image copying and pasting detection method based on circular description operator CSIFT (Colored scale invariant feature transform) | |
CN104969268A (en) | Authentication of security documents and mobile device to carry out the authentication | |
CN113313225B (en) | Anti-counterfeiting method based on sparse dot matrix code | |
CN108009538A (en) | A kind of automobile engine cylinder-body sequence number intelligent identification Method | |
CN1290047C (en) | File anti-fake method and its device based on digital water print | |
CN114241248B (en) | River crab origin tracing method and system | |
CN113435219B (en) | Anti-counterfeiting detection method and device, electronic equipment and storage medium | |
CN108038482A (en) | A kind of automobile engine cylinder-body sequence number Visual intelligent identifying system | |
CN113392664B (en) | Anti-counterfeiting two-dimensional code generation method, anti-counterfeiting method and device | |
CN107134048A (en) | A kind of bill anti-counterfeit discrimination method of Intelligent Recognition watermark feature | |
CN110378351A (en) | Seal discrimination method and device | |
CN106934756B (en) | Method and system for embedding information in single-color or special-color image | |
CN110619060B (en) | Cigarette carton image database construction method and cigarette carton anti-counterfeiting query method | |
CN107316072A (en) | Dimension code anti-counterfeit method, anti-counterfeit authentication method and the false proof device of offline synchronization |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication ||
SE01 | Entry into force of request for substantive examination ||
GR01 | Patent grant ||
CB03 | Change of inventor or designer information | Inventor after: Wang Xiaolin; Tai Weipeng; Li Hao; Wang Jie; Zhang Bingliang. Inventor before: Tai Weipeng; Li Hao; Wang Jie; Zhang Bingliang; Wang Xiaolin |