CN108830281A - Repeated image matching method based on local change detection and spatial weighting - Google Patents

Repeated image matching method based on local change detection and spatial weighting Download PDF

Info

Publication number
CN108830281A
CN108830281A (application CN201810527166.7A, granted as CN108830281B)
Authority
CN
China
Prior art keywords
projective transformation
point
matching
image
match point
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201810527166.7A
Other languages
Chinese (zh)
Other versions
CN108830281B (en)
Inventor
熊健
楼婧蕾
周仕雪
王姮冰
桂冠
杨洁
范山岗
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nanjing University of Posts and Telecommunications
Original Assignee
Nanjing University of Posts and Telecommunications
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nanjing University of Posts and Telecommunications
Priority to CN201810527166.7A priority Critical patent/CN108830281B/en
Publication of CN108830281A publication Critical patent/CN108830281A/en
Application granted granted Critical
Publication of CN108830281B publication Critical patent/CN108830281B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 10/00: Arrangements for image or video recognition or understanding
    • G06V 10/40: Extraction of image or video features
    • G06V 10/46: Descriptors for shape, contour or point-related descriptors, e.g. scale invariant feature transform [SIFT] or bags of words [BoW]; Salient regional features
    • G06V 10/462: Salient features, e.g. scale invariant feature transforms [SIFT]
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 10/00: Arrangements for image or video recognition or understanding
    • G06V 10/40: Extraction of image or video features
    • G06V 10/44: Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components

Abstract

The invention discloses a repeated image matching method based on local change detection and spatial weighting. The method comprises the following steps: extract the match points of the images to be matched with the SIFT algorithm; apply a projective transformation; evaluate three indicators of the image pair: the distance between match points, the outlier points, and the projective transformation amplitude; solve, by least squares, an objective function that maximizes the positional similarity of the match points while minimizing the transformation amplitude, yielding the projective transformation parameters; compute the positional similarity of the match points under these parameters and compare it with a threshold to decide whether the images match. By introducing a projective transformation the method simulates the influence of small changes in shooting angle on the result, takes the spatial distribution of pixels into account, and adds the influence of outlier points on the matching result, which is of practical significance for detecting repeated images on surveillance platforms.

Description

Repeated image matching method based on local change detection and spatial weighting
Technical field
The invention belongs to the technical field of image detection, and in particular relates to a repeated image matching method based on local change detection and spatial weighting.
Background art
With the development of information technology and people's rising security requirements, video surveillance systems are widely used in many fields. As an important component of security products, video surveillance accounts for roughly 50% of the sector: during 2014-2016 its share of security-product applications was 47.06%, 48.33% and 50.63% respectively. From 2010 to 2017, the Chinese video surveillance market grew from 24.2 billion yuan to 112.4 billion yuan, a compound annual growth rate of 24.53%.
Scholars at home and abroad have done extensive research on repeated image detection in surveillance video. The main image matching algorithms fall into four classes: region-based, feature-based, model-based and transform-domain-based matching algorithms. Extracting feature points of the images to be matched (such as brightness, edge, corner and contour information) and matching them under a similarity measure is currently the most common and most effective approach to repeated image detection.
SIFT (scale-invariant feature transform) is a local feature descriptor. SIFT feature matching can handle translation, rotation and affine transformation between two images and has strong matching capability. However, the matched feature points extracted by the SIFT algorithm are used without exploiting their location information, and neither the spatial distribution of the points nor the influence of outliers is considered, so applying the algorithm directly to repeated image matching lacks a certain soundness.
Clustering, also known as cluster analysis, is a multivariate statistical method for quantitatively classifying multiple samples (or indicators). Clustering applied to samples is called Q-type clustering.
On this basis, the present invention introduces a two-dimensional Gaussian function and Q-type clustering on top of the SIFT algorithm and proposes a repeated image matching method based on local change detection and spatial weighting.
Summary of the invention
To remedy the deficiencies of the prior art, the present invention provides a repeated image matching method based on local change detection and spatial weighting, solving the inaccurate matching caused by ignoring the spatial distribution of pixels and the outliers when measuring the degree of repetition between images.
To achieve the above purpose, the present invention adopts the following technical scheme: a repeated image matching method based on local change detection and spatial weighting, characterized by comprising the steps:
Step 1: input the images to be matched and extract their match points and the corresponding location information with the SIFT algorithm;
Step 2: select one of the images to be matched and apply a projective transformation;
Step 3: after the projective transformation, take the distance between match points, the outliers and the projective transformation amplitude as the three indicators measuring the matching degree of repeated images;
Step 4: combining the distance between match points, the outliers and the projective transformation amplitude of step 3, establish an objective function that maximizes the positional similarity of the match points and minimizes the transformation amplitude;
Step 5: solve the objective function for the projective transformation parameters by least squares, substitute them into the objective function, and compare the final objective value with a threshold to obtain the matching result.
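The five steps above can be sketched end to end in code. The sketch below is an editor's illustration, not the patent's exact method: the transform parametrization (a small per-axis deviation from the identity), the Gaussian spread and the form of the objective are all assumptions standing in for the formulas (2)-(10), whose images are not reproduced in this text.

```python
import numpy as np

def match_score(P, P_tilde, m, l, lam=1.0):
    """End-to-end sketch of steps 2-5 for (n, 2) arrays of corresponding
    match-point coordinates (step 1, SIFT extraction, is assumed already
    done). Returns an objective value J; smaller means a better match."""
    P, Pt = np.asarray(P, dtype=float), np.asarray(P_tilde, dtype=float)

    # 2-D Gaussian spatial weights centred on the image centre (0.5m, 0.5l);
    # the spread (0.25 of each dimension) is an assumed parameter.
    s = 0.25
    w = np.exp(-((P[:, 0] - 0.5 * m) ** 2 / (2 * (s * m) ** 2)
                 + (P[:, 1] - 0.5 * l) ** 2 / (2 * (s * l) ** 2)))

    # Least-squares fit of an assumed small transform x^=(1+a)x+b, y^=(1+c)y+d;
    # the ridge penalty lam plays the role of the amplitude term in J.
    def axis(u, v):
        A = np.array([[np.sum(w * u * u) + lam, np.sum(w * u)],
                      [np.sum(w * u),           np.sum(w) + lam]])
        rhs = np.array([np.sum(w * u * (v - u)), np.sum(w * (v - u))])
        return np.linalg.solve(A, rhs)
    a, b = axis(P[:, 0], Pt[:, 0])
    c, d = axis(P[:, 1], Pt[:, 1])

    P_hat = np.column_stack([(1 + a) * P[:, 0] + b, (1 + c) * P[:, 1] + d])
    e = w * np.linalg.norm(P_hat - Pt, axis=1)       # weighted distances
    g = a**2 + b**2 + c**2 + d**2                    # transform amplitude
    return float(np.mean(e ** 2) + lam * g)          # objective J
```

The score would then be compared with a threshold (step 5); identical point sets give J = 0.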
In the aforementioned repeated image matching method based on local change detection and spatial weighting, step 1 extracts the match points of the images to be matched and their corresponding location information with the SIFT algorithm, giving formula (1):
where n is the number of matched feature-point pairs, (x_i, y_i) and (x̃_i, ỹ_i) denote the positions of the i-th match point in the two images, i = 1, ..., n; P and P̃ denote all match-point positions of the two images, with a one-to-one correspondence between P and P̃.
In the aforementioned method, step 2 selects one of the images to be matched and applies the projective transformation of formula (2):
where a, b, c, d are the projective transformation parameters, P̂ denotes the match-point positions of P after the projective transformation, and P denotes all match-point positions in the selected image.
In the aforementioned method, step 3 takes, after the projective transformation, the distance between match points, the outliers and the projective transformation amplitude as the three indicators measuring the matching degree of repeated images, specifically:
Step 3.1: compute the distances between corresponding match points of the images to be matched after the projective transformation:
i.e. the Euclidean distance between each transformed match point and its corresponding match point;
these distances are then weighted by combining them with a two-dimensional Gaussian function, giving the weighted distance e_i between corresponding points of P̂ and P̃:
where ω_i is the weight obtained from the two-dimensional Gaussian function, and m and l are the numbers of pixel rows and columns of the image;
Step 3.2: detect outliers among the match points.
Q-type clustering is introduced to find potential outliers. With the Euclidean distance of each match point as the classification criterion, the match points are divided into a larger-fluctuation class and a smaller-fluctuation class; the points with larger fluctuation are regarded as potential outliers, indicated by the 0-1 variable f_i of formula (6):
where Z denotes the distance set of the smaller-fluctuation match points and Z̃ the distance set of the larger-fluctuation match points, i.e. of the potential outliers;
Step 3.3: detect the projective transformation amplitude g, given by formula (7):
g = a² + b² + c² + d²  (7)
where a, b, c, d are the projective transformation parameters; the larger the amplitude g, the larger the projective transformation and the lower the matching degree of the images.
In the aforementioned method, step 4 combines the match-point distance, the outliers and the projective transformation amplitude of step 3 to establish the objective function J, which maximizes the positional similarity of the match points and minimizes the transformation amplitude:
where n is the number of matched feature-point pairs and λ is a correction factor.
In the aforementioned method, step 5 solves the objective function for the projective transformation parameters by least squares, substitutes them into the objective function and compares the final objective value with the threshold to obtain the matching result, specifically:
the objective function (8) is solved by least squares:
taking partial derivatives with respect to the parameters a, b, c, d gives the equation group (9);
solving equation group (9) yields the parameters a, b, c, d, as in formula (10);
substituting the solved parameters of formula (10) into the expression for J in formula (8), the final objective value is compared with the threshold to judge the matching degree of the images.
Beneficial effects: the repeated image matching method based on local change detection and spatial weighting proposed by the invention can effectively detect redundant images on a surveillance platform, saving a large amount of storage space. By introducing a projective transformation it simulates the influence of small changes in shooting angle on the result; it takes into account the influence of the spatial distribution of pixels and adds the influence of outliers on the matching result, improving the soundness and reliability of repeated image detection.
Detailed description of the invention
Fig. 1 is a schematic flow diagram of the method of the present invention;
Fig. 2 is a schematic diagram of the implementation process of the SIFT algorithm.
Specific embodiment
The invention is further described below with reference to the accompanying drawings. The following embodiments are only intended to clearly illustrate the technical solution of the invention and do not limit its scope of protection.
As shown in Fig. 1, a repeated image matching method based on local change detection and spatial weighting comprises the following steps:
Step 1: input the images to be matched and, as shown in Fig. 2, extract their match points and the corresponding location information with the SIFT algorithm, as in formula (1),
where n is the number of matched feature-point pairs, (x_i, y_i) and (x̃_i, ỹ_i) denote the positions of the i-th match point in the two images, i = 1, ..., n; P and P̃ denote all match-point positions of the two images, with a one-to-one correspondence between P and P̃;
Step 2: select one of the images to be matched and apply a projective transformation, simulating the influence of slight changes in shooting angle on the matching result.
Most repeated images are inevitably affected by the shooting angle, so a projective transformation is introduced to simulate this influence. Assuming a projective transformation of the form of formula (2) exists, the match-point distributions of the images to be matched are roughly the same after the transformation. Specifically, one of the images to be matched is selected and the following projective transformation is applied:
where a, b, c, d are the projective transformation parameters and P̂ denotes the match-point positions of P after the projective transformation;
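Formula (2) itself is not reproduced in this text, so the sketch below assumes, purely for illustration, a small per-axis deviation from the identity, x̂ = (1+a)x + b and ŷ = (1+c)y + d; this choice makes the amplitude g = a² + b² + c² + d² of formula (7) vanish for an unchanged image. The function names are likewise illustrative.

```python
import numpy as np

def apply_transform(P, a, b, c, d):
    """Apply the 4-parameter transform to an (n, 2) array of match points.
    Assumed form (not the patent's formula (2)): a small per-axis
    deviation from the identity, so g = 0 means the image is unchanged."""
    P = np.asarray(P, dtype=float)
    x, y = P[:, 0], P[:, 1]
    return np.column_stack([(1 + a) * x + b, (1 + c) * y + d])

def amplitude(a, b, c, d):
    """Transform amplitude g of formula (7)."""
    return a**2 + b**2 + c**2 + d**2
```

Under this parametrization the identity transform has parameters (0, 0, 0, 0) and amplitude 0, consistent with the text's reading that a larger g means a larger transformation.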
Step 3: after the projective transformation, take the distance between match points, the outliers and the projective transformation amplitude as the three indicators measuring the matching degree of repeated images. The detailed process is:
Step 3.1: compute the distances between corresponding match points of the images to be matched after the projective transformation.
After the projective transformation, the distance between match points indicates whether an appropriate projective transformation exists: if the distances are too large, no appropriate projective transformation relates the images to be matched, i.e. the matching degree is low. The distance is characterized by the Euclidean distance:
i.e. the Euclidean distance between each transformed match point and its corresponding match point;
Most of an image's principal information is concentrated in the central region, while the surrounding pixels mostly carry background information and influence the matching result far less than the central region. A two-dimensional Gaussian function is therefore introduced to describe the influence of the spatial distribution of the pixels on the result: its center is placed at the central pixel (0.5m, 0.5l) of the image (where m and l are the numbers of pixel rows and columns), and its value decreases radially outward from the center. The final match-point distance e_i is obtained by combining the Euclidean distance with the two-dimensional Gaussian weight:
where e_i is the weighted Euclidean distance between corresponding points of P̂ and P̃, and ω_i is the weight obtained from the two-dimensional Gaussian function.
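The Gaussian weighting of step 3.1 can be sketched as follows. The spread of the Gaussian (`sigma_scale`) is an assumed parameter, since the text does not state the variance used; everything else follows the description (centre at (0.5m, 0.5l), radial decay).

```python
import numpy as np

def gaussian_weights(P, m, l, sigma_scale=0.25):
    """Weights w_i from a 2-D Gaussian centred at the image centre
    (0.5*m, 0.5*l), decaying radially so that central pixels count more.
    sigma_scale is an assumed spread parameter."""
    P = np.asarray(P, dtype=float)
    cx, cy = 0.5 * m, 0.5 * l
    sx, sy = sigma_scale * m, sigma_scale * l
    return np.exp(-((P[:, 0] - cx) ** 2 / (2 * sx ** 2)
                    + (P[:, 1] - cy) ** 2 / (2 * sy ** 2)))

def weighted_distances(P_hat, P_tilde, w):
    """e_i: Euclidean distance between corresponding match points,
    weighted by the Gaussian spatial weight w_i (formulas (3)-(5))."""
    diff = np.asarray(P_hat, dtype=float) - np.asarray(P_tilde, dtype=float)
    return w * np.linalg.norm(diff, axis=1)
```

A point at the exact image centre receives the maximal weight 1; weights fall off toward the borders, so background-region mismatches contribute less to the distance measure.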
Step 3.2: detect outliers among the match points.
Considering the distances of all match points as a whole often overlooks the presence of outliers. Q-type clustering is therefore introduced to find potential outliers: with the Euclidean distance of each match point as the classification criterion, the match points are divided into a larger-fluctuation class and a smaller-fluctuation class. The points with larger fluctuation are regarded as potential outliers, whose distance values matter most to the matching result; they are indicated by the 0-1 variable f_i of formula (6):
where Z denotes the distance set of the smaller-fluctuation match points and Z̃ the distance set of the larger-fluctuation match points, i.e. of the potential outliers;
Step 3.3: detect the projective transformation amplitude. When measuring the images to be matched, besides the match-point distances, the pixel spatial distribution and the outliers, the projective transformation amplitude must also be kept within a certain range: an excessive projective transformation loses its meaning and no longer guarantees the similarity of the images. The amplitude g is given by formula (7):
g = a² + b² + c² + d²  (7)
where a, b, c, d are the projective transformation parameters; the larger the amplitude g, the larger the projective transformation and the lower the matching degree of the images;
Step 4: combining the three indicators of step 3 (match-point distance, outliers, projective transformation amplitude), establish the objective function that maximizes the positional similarity of the match points and minimizes the transformation amplitude, specifically:
There are two main sources of influence on the matching result: the location information of the match points and the projective transformation. The location information is measured by the Euclidean distance, with the influence of the spatial distribution of pixels and of the outliers added; the influence of the match-point locations is finally characterized by the Gauss-weighted distance sum including the potential outliers. For images with a high matching degree, the distances should be as small as possible under an appropriate projective transformation, while the transformation itself must stay within a tolerable range, otherwise it loses its meaning. This yields the objective function J:
where n is the number of matched feature-point pairs and λ is a correction factor; based on experimental results, λ is chosen in the range 0.1 to 10;
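Since formula (8) is not reproduced in this text, the sketch below assumes one plausible combination of the three indicators: the mean of the weighted squared distances, with flagged outliers counted double, plus λ times the amplitude g. Both the doubling factor and the exact combination are assumptions.

```python
import numpy as np

def objective(e, f, a, b, c, d, lam=1.0):
    """One plausible reading of objective J (formula (8)): mean of the
    Gaussian-weighted squared distances e_i, with flagged outliers
    (f_i = 1) counted double, plus lam times the amplitude g of
    formula (7). The combination is an assumption, not the patent's
    exact expression."""
    e = np.asarray(e, dtype=float)
    f = np.asarray(f, dtype=float)
    g = a**2 + b**2 + c**2 + d**2            # amplitude, formula (7)
    return float(np.mean((1.0 + f) * e ** 2) + lam * g)
```

Smaller J means higher positional similarity under a smaller transformation; step 5 compares J with a threshold.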
Step 5: solve the objective function for the projective transformation parameters by least squares, substitute them into the objective function and compare the final objective value with the threshold to obtain the matching result.
The projective transformation parameters are estimated by solving the objective function (8) with least squares:
taking partial derivatives with respect to the parameters a, b, c, d gives the equation group (9);
solving equation group (9) yields the parameters a, b, c, d, as in formula (10);
substituting the solved parameters of formula (10) into the expression for J in formula (8), the final objective value is compared with the threshold to judge the matching degree of the images.
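Under the illustrative transform assumed earlier (x̂ = (1+a)x + b, ŷ = (1+c)y + d), the least-squares step has the same closed-form flavour as equation group (9) and formula (10): each axis reduces to a 2-by-2 weighted ridge system. The function names and the decision threshold below are assumptions.

```python
import numpy as np

def fit_params(P, P_tilde, w, lam=1.0):
    """Closed-form least-squares estimate of (a, b, c, d), analogous to
    solving equation group (9) for formula (10). Assumes the illustrative
    transform x^ = (1+a)x + b, y^ = (1+c)y + d; each axis is a weighted
    ridge regression with penalty lam (lam > 0 keeps the system
    well-conditioned)."""
    P, Pt, w = (np.asarray(v, dtype=float) for v in (P, P_tilde, w))

    def axis(u, v):
        # minimise sum_i w_i ((1+a) u_i + b - v_i)^2 + lam (a^2 + b^2)
        A = np.array([[np.sum(w * u * u) + lam, np.sum(w * u)],
                      [np.sum(w * u),           np.sum(w) + lam]])
        rhs = np.array([np.sum(w * u * (v - u)), np.sum(w * (v - u))])
        return np.linalg.solve(A, rhs)

    a, b = axis(P[:, 0], Pt[:, 0])
    c, d = axis(P[:, 1], Pt[:, 1])
    return float(a), float(b), float(c), float(d)

def decide(P, P_tilde, w, lam=1.0, threshold=0.5):
    """Step 5: fit the parameters, evaluate the objective value and
    compare it with an (assumed) threshold; smaller J means a better
    match."""
    P, Pt, w = (np.asarray(v, dtype=float) for v in (P, P_tilde, w))
    a, b, c, d = fit_params(P, Pt, w, lam)
    P_hat = np.column_stack([(1 + a) * P[:, 0] + b, (1 + c) * P[:, 1] + d])
    e = w * np.linalg.norm(P_hat - Pt, axis=1)       # weighted distances
    J = float(np.mean(e ** 2) + lam * (a**2 + b**2 + c**2 + d**2))
    return J <= threshold, J
```

With identical point sets the fitted parameters are zero and the objective vanishes, so the pair is reported as matching for any positive threshold.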
In summary, the present invention can simulate the influence of subtle shooting-angle variations on the result, takes the spatial distribution of the pixels into account, and adds the influence of outliers on the matching result, giving higher robustness.
The above is only a preferred embodiment of the present invention. It should be noted that those of ordinary skill in the art may make several improvements and variations without departing from the technical principles of the invention, and such improvements and variations shall also be regarded as falling within the scope of protection of the invention.

Claims (6)

1. A repeated image matching method based on local change detection and spatial weighting, characterized by comprising the steps of:
Step 1: inputting the images to be matched and extracting their match points and the corresponding location information with the SIFT algorithm;
Step 2: selecting one of the images to be matched and applying a projective transformation;
Step 3: after the projective transformation, taking the distance between match points, the outliers and the projective transformation amplitude as the three indicators measuring the matching degree of repeated images;
Step 4: combining the distance between match points, the outliers and the projective transformation amplitude of step 3, establishing an objective function that maximizes the positional similarity of the match points and minimizes the transformation amplitude;
Step 5: solving the objective function for the projective transformation parameters by least squares, substituting them into the objective function, and comparing the final objective value with a threshold to obtain the matching result.
2. The repeated image matching method based on local change detection and spatial weighting according to claim 1, characterized in that in step 1 the match points of the images to be matched and their corresponding location information are extracted by the SIFT algorithm, giving formula (1):
where n is the number of matched feature-point pairs, (x_i, y_i) and (x̃_i, ỹ_i) denote the positions of the i-th match point in the two images, i = 1, ..., n; P and P̃ denote all match-point positions of the two images, with a one-to-one correspondence between P and P̃.
3. The repeated image matching method based on local change detection and spatial weighting according to claim 1, characterized in that step 2 selects one of the images to be matched and applies the projective transformation:
where a, b, c, d are the projective transformation parameters, P̂ denotes the match-point positions of P after the projective transformation, and P denotes all match-point positions in the selected image.
4. The repeated image matching method based on local change detection and spatial weighting according to claim 3, characterized in that step 3, after the projective transformation, takes the distance between match points, the outliers and the projective transformation amplitude as the three indicators measuring the matching degree of repeated images, specifically:
Step 3.1: computing the distances between corresponding match points of the images to be matched after the projective transformation:
i.e. the Euclidean distance between each transformed match point and its corresponding match point;
these distances are then weighted by combining them with a two-dimensional Gaussian function, giving the weighted distance e_i between corresponding points of P̂ and P̃:
where ω_i is the weight obtained from the two-dimensional Gaussian function, and m and l are the numbers of pixel rows and columns of the image;
Step 3.2: detecting the outliers among the match points;
Q-type clustering is introduced to find potential outliers: with the Euclidean distance of each match point as the classification criterion, the match points are divided into a larger-fluctuation class and a smaller-fluctuation class; the points with larger fluctuation are regarded as potential outliers, indicated by the 0-1 variable f_i of formula (6):
where Z denotes the distance set of the smaller-fluctuation match points and Z̃ the distance set of the larger-fluctuation match points, i.e. of the potential outliers;
Step 3.3: detecting the projective transformation amplitude g, given by formula (7):
g = a² + b² + c² + d²  (7)
where a, b, c, d are the projective transformation parameters; the larger the amplitude g, the larger the projective transformation and the lower the matching degree of the images.
5. The repeated image matching method based on local change detection and spatial weighting according to claim 4, characterized in that step 4 combines the match-point distance, the outliers and the projective transformation amplitude of step 3 to establish the objective function J, which maximizes the positional similarity of the match points and minimizes the transformation amplitude:
where n is the number of matched feature-point pairs and λ is a correction factor.
6. The repeated image matching method based on local change detection and spatial weighting according to claim 5, characterized in that step 5 solves the objective function for the projective transformation parameters by least squares, substitutes them into the objective function and compares the final objective value with the threshold to obtain the matching result, specifically:
the objective function (8) is solved by least squares:
taking partial derivatives with respect to the parameters a, b, c, d gives the equation group (9);
solving equation group (9) yields the parameters a, b, c, d, as in formula (10);
substituting the solved parameters of formula (10) into the expression for J in formula (8), the final objective value is compared with the threshold to judge the matching degree of the images.
CN201810527166.7A 2018-05-29 2018-05-29 Repeated image matching method based on local change detection and spatial weighting Active CN108830281B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810527166.7A CN108830281B (en) 2018-05-29 2018-05-29 Repeated image matching method based on local change detection and spatial weighting

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201810527166.7A CN108830281B (en) 2018-05-29 2018-05-29 Repeated image matching method based on local change detection and spatial weighting

Publications (2)

Publication Number Publication Date
CN108830281A true CN108830281A (en) 2018-11-16
CN108830281B CN108830281B (en) 2021-09-28

Family

ID=64145888

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810527166.7A Active CN108830281B (en) 2018-05-29 2018-05-29 Repeated image matching method based on local change detection and spatial weighting

Country Status (1)

Country Link
CN (1) CN108830281B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111832910A (en) * 2020-06-24 2020-10-27 陕西法士特齿轮有限责任公司 Method and system for determining multi-index abnormal sound judgment threshold value and computer equipment

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101354796A (en) * 2008-09-05 2009-01-28 浙江大学 Omnidirectional stereo vision three-dimensional rebuilding method based on Taylor series model
CN102968777A (en) * 2012-11-20 2013-03-13 河海大学 Image stitching method based on overlapping region scale-invariant feature transform (SIFT) feature points
US20140064554A1 (en) * 2011-11-14 2014-03-06 San Diego State University Research Foundation Image station matching, preprocessing, spatial registration and change detection with multi-temporal remotely-sensed imagery
CN105654421A (en) * 2015-12-21 2016-06-08 西安电子科技大学 Projection transform image matching method based on transform invariant low-rank texture

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101354796A (en) * 2008-09-05 2009-01-28 浙江大学 Omnidirectional stereo vision three-dimensional rebuilding method based on Taylor series model
US20140064554A1 (en) * 2011-11-14 2014-03-06 San Diego State University Research Foundation Image station matching, preprocessing, spatial registration and change detection with multi-temporal remotely-sensed imagery
CN102968777A (en) * 2012-11-20 2013-03-13 河海大学 Image stitching method based on overlapping region scale-invariant feature transform (SIFT) feature points
CN105654421A (en) * 2015-12-21 2016-06-08 西安电子科技大学 Projection transform image matching method based on transform invariant low-rank texture

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
YAO Guobiao et al., "High-precision automatic registration method for oblique stereo images fusing complementary affine-invariant features", Acta Geodaetica et Cartographica Sinica *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111832910A (en) * 2020-06-24 2020-10-27 陕西法士特齿轮有限责任公司 Method and system for determining multi-index abnormal sound judgment threshold value and computer equipment
CN111832910B (en) * 2020-06-24 2024-03-12 陕西法士特齿轮有限责任公司 Multi-index abnormal sound judgment threshold value determining method, system and computer equipment

Also Published As

Publication number Publication date
CN108830281B (en) 2021-09-28

Similar Documents

Publication Publication Date Title
CN103035013B (en) A kind of precise motion shadow detection method based on multi-feature fusion
CN103632158B (en) Forest fire prevention monitor method and forest fire prevention monitor system
CN104809463B (en) A kind of high-precision fire disaster flame detection method for converting dictionary learning based on intensive scale invariant feature
CN110148162A (en) A kind of heterologous image matching method based on composition operators
CN108765470A (en) One kind being directed to the improved KCF track algorithms of target occlusion
CN103544499B (en) The textural characteristics dimension reduction method that a kind of surface blemish based on machine vision is detected
CN105224921A (en) A kind of facial image preferentially system and disposal route
CN107579846B (en) Cloud computing fault data detection method and system
CN110580510B (en) Clustering result evaluation method and system
CN115272652A (en) Dense object image detection method based on multiple regression and adaptive focus loss
CN112488211A (en) Fabric image flaw classification method
CN109711267A (en) A kind of pedestrian identifies again, pedestrian movement's orbit generation method and device
CN112288758B (en) Infrared and visible light image registration method for power equipment
CN109509188A (en) A kind of transmission line of electricity typical defect recognition methods based on HOG feature
CN109472770A (en) A kind of image characteristic point Fast Match Algorithm in printed circuit board (PCB) detecting
CN103366373B (en) Multi-time-phase remote-sensing image change detection method based on fuzzy compatible chart
CN108830281A (en) A kind of multiimage matching process based on localized variation detection and spatial weighting
CN113191359A (en) Small sample target detection method and system based on support and query samples
CN109583307A (en) A kind of Cashmere and Woolens fiber recognition method based on local feature Yu word packet model
CN108986083A (en) SAR image change detection based on threshold optimization
CN106326927B (en) A kind of shoes print new category detection method
CN108257148A (en) The target of special object suggests window generation method and its application in target following
CN107886060A (en) Pedestrian's automatic detection and tracking based on video
CN108710886B (en) Repeated image matching method based on SIFT algorithm
KR101133225B1 (en) System and method for recognizing face using pose estimation

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant