CN106778915A - Target matching method for multiple cameras - Google Patents

Target matching method for multiple cameras

Info

Publication number
CN106778915A
CN106778915A CN201710034514.2A
Authority
CN
China
Prior art keywords
image
target
block
denoted
matching
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201710034514.2A
Other languages
Chinese (zh)
Inventor
向北海
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hunan Youxiang Technology Co Ltd
Original Assignee
Hunan Youxiang Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hunan Youxiang Technology Co Ltd filed Critical Hunan Youxiang Technology Co Ltd
Priority to CN201710034514.2A priority Critical patent/CN106778915A/en
Publication of CN106778915A publication Critical patent/CN106778915A/en
Pending legal-status Critical Current

Links

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/70Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/74Image or video pattern matching; Proximity measures in feature spaces
    • G06V10/75Organisation of the matching processes, e.g. simultaneous or sequential comparisons of image or video features; Coarse-fine approaches, e.g. multi-scale approaches; using context analysis; Selection of dictionaries
    • G06V10/758Involving statistics of pixels or of feature values, e.g. histogram matching
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/40Extraction of image or video features
    • G06V10/44Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components
    • G06V10/443Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components by matching or filtering
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/40Extraction of image or video features
    • G06V10/56Extraction of image or video features relating to colour
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/40Extraction of image or video features
    • G06V10/46Descriptors for shape, contour or point-related descriptors, e.g. scale invariant feature transform [SIFT] or bags of words [BoW]; Salient regional features
    • G06V10/467Encoded features or binary features, e.g. local binary patterns [LBP]
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/70Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/74Image or video pattern matching; Proximity measures in feature spaces
    • G06V10/75Organisation of the matching processes, e.g. simultaneous or sequential comparisons of image or video features; Coarse-fine approaches, e.g. multi-scale approaches; using context analysis; Selection of dictionaries
    • G06V10/759Region-based matching

Abstract

The present invention proposes a target matching method for multiple cameras. A simple gray-scale matching step first finds a set of candidate image blocks; a high-precision ordered-comparison feature description is then computed for each image and used to score the matches, yielding the final matching result. The method considers matching accuracy and time complexity together: it achieves good matching performance without excessive computation and is suitable for a variety of video image processing systems.

Description

Target matching method for multiple cameras
Technical field
The invention belongs to the technical fields of image processing, video surveillance and computer vision, and in particular relates to a target matching method for multiple cameras.
Background art
With the emergence of intelligent applications such as intelligent transportation and safe cities, intelligent video surveillance has become an emerging direction in computer vision. Target detection and tracking with a single camera has already been applied successfully in some small-scale scenes, but a single camera's field of view is very limited: it cannot observe a target from multiple angles or over a long period. Multiple cameras offer a large monitoring range and wide observation angles, and their combined information increases system robustness, so they are used ever more widely in video surveillance.
However, using multiple cameras also raises a series of new problems, including target matching between cameras, relay tracking of targets, and data fusion. Target matching between cameras means finding the correspondence between targets in different image sequences; its result directly affects all subsequent processing, making it one of the most important and fundamental problems in multi-camera applications.
In recent years, researchers have proposed a number of methods for this problem. They fall broadly into two classes: region-based methods and point-based methods. Region-based methods treat the target as a region and match colour-related features of that region across views, but such features are sensitive to illumination, which leads to large matching errors. Point-based methods match the target's feature points under geometric constraints to achieve multi-camera target matching, but they require substantial prior knowledge for modelling, and the process is complicated, so they cannot meet practical demands.
Summary of the invention
To address the shortcomings of existing tracking methods, the object of the present invention is to propose a target matching method for multiple cameras.
The technical scheme of the invention is as follows:
A target matching method for multiple cameras, comprising the following steps:
S1: Two cameras shoot the target synchronously, producing two video images P1(x, y) and P2(x, y) of the target. The target region is then marked in the first image P1(x, y); here this is done by manual specification. A rectangular image containing the target is selected by hand and denoted target image A(x, y); the rectangle size is a × b.
S2: Using a window of size a × b, perform a sliding search over image P2(x, y) and find a set of candidate image blocks by gray-scale matching;
During the sliding scan, each position of the window yields an image block of size a × b on image P2(x, y), denoted B(x, y). The gray-level difference SAD(B) between B(x, y) and the target image A(x, y) is computed:
SAD(B) = Σ |A(x, y) − B(x, y)|  (1)
By this definition, the smaller the gray-level difference, the higher the degree of match. After the sliding search is finished, all the difference values are sorted, and the 100 image blocks with the smallest values are kept as the candidate image block set, denoted {Bi(x, y) | i = 1, …, 100}.
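The coarse stage described above can be sketched in a few lines of NumPy. This is a minimal illustration, not the patent's implementation: the function name, the `step` parameter and the brute-force double loop are our own choices.

```python
import numpy as np

def sad_candidates(target, search_img, num_candidates=100, step=1):
    """Slide an a-by-b window over search_img and keep the num_candidates
    blocks with the smallest sum of absolute gray-level differences (SAD,
    equation (1)) against the target image."""
    a, b = target.shape
    H, W = search_img.shape
    t = target.astype(np.int32)
    scored = []
    for y in range(0, H - a + 1, step):
        for x in range(0, W - b + 1, step):
            block = search_img[y:y + a, x:x + b].astype(np.int32)
            sad = int(np.abs(t - block).sum())   # SAD(B) = sum |A - B|
            scored.append((sad, y, x))
    scored.sort(key=lambda s: s[0])              # smaller SAD = better match
    return scored[:num_candidates]
```

The brute-force loop costs O(W·H·a·b); template-matching routines in common vision libraries compute the same kind of score much faster, which matters when the search image is large.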
S3: Compute an ordered-comparison feature description of the target image A(x, y), yielding one feature vector;
S3.1: Partition the target image A(x, y) into small image blocks of size 3 × 3 each; this yields sum = (a/3) × (b/3) small blocks (assuming a and b are multiples of 3), denoted {k1(x, y), k2(x, y), …, ksum(x, y)}.
S3.2: Apply the ordered-comparison coding to every 3 × 3 small block of the target image A(x, y);
The ordered-comparison coding of a 3 × 3 small block works as follows: measure the ordered gray-mean comparison relations between the small block and eight equally sized, uniformly distributed rectangular regions around it, and concatenate these eight ordered comparison relations into a binary-coded descriptor, giving one feature value. The ordered-comparison coding proposed by the invention borrows the idea of the local binary pattern descriptor, but comparisons between rectangular regions are more stable and reliable than comparisons between single pixels, because computing a gray mean amounts to mean filtering, which suppresses image noise well.
a. For any small block kj(x, y) of the target image A(x, y), compute its gray mean μ = (1/9) Σ(x,y)∈Ω kj(x, y), where Ω is the set of all pixels of kj(x, y).
b. Centred on the small block kj(x, y), select eight uniformly distributed rectangular regions of size 3 × 3 around it: denoting the centre pixel of kj(x, y) as (x0, y0), the centre pixels of the eight surrounding 3 × 3 regions are (x0−6, y0), (x0−3, y0+3), (x0, y0+6), (x0+3, y0+3), (x0+6, y0), (x0+3, y0−3), (x0, y0−6), (x0−3, y0−3). Compute the gray means of these eight rectangular regions in this order, denoted {μ1, μ2, …, μ8}.
c. The ordered comparison takes the gray mean μ of the small block kj(x, y) as the threshold: each of the surrounding gray means {μ1, μ2, …, μ8} is compared with μ and marked 1 if it exceeds μ, 0 otherwise. The comparisons thus produce 8 binary digits, which are converted to a decimal number to obtain the ordered-comparison feature value YXT(kj) of the block. Its formula is:
YXT(kj) = Σp=1…8 s(μp − μ) · 2^p  (2)
where s(z) = 1 if z > 0, and s(z) = 0 otherwise.
d. Apply the ordered-comparison coding of steps a to c to every small block of image A(x, y); this finally yields a feature vector of length sum, denoted YXT_A.
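Steps a to d can be sketched as follows. This is a minimal NumPy illustration under stated assumptions, not the patent's implementation: the helper names are our own, and the edge padding for blocks near the image border is our assumption, since the text leaves boundary handling unspecified.

```python
import numpy as np

# Centre offsets of the eight surrounding 3x3 regions, in the order of
# step b: (x0-6,y0), (x0-3,y0+3), (x0,y0+6), (x0+3,y0+3), (x0+6,y0),
# (x0+3,y0-3), (x0,y0-6), (x0-3,y0-3).
OFFSETS = [(-6, 0), (-3, 3), (0, 6), (3, 3), (6, 0), (3, -3), (0, -6), (-3, -3)]

def _mean3(img, cx, cy):
    """Gray mean of the 3x3 region centred at column cx, row cy."""
    return img[cy - 1:cy + 2, cx - 1:cx + 2].mean()

def yxt_value(padded, cx, cy):
    """Equation (2): YXT(kj) = sum_p s(mu_p - mu) * 2^p, taking the
    block's own gray mean mu as the comparison threshold (steps a-c)."""
    mu = _mean3(padded, cx, cy)
    code = 0
    for p, (dx, dy) in enumerate(OFFSETS, start=1):
        if _mean3(padded, cx + dx, cy + dy) > mu:   # s(mu_p - mu) = 1
            code += 2 ** p
    return code

def yxt_vector(img):
    """Step d: concatenate the YXT values of every 3x3 block into the
    feature vector YXT_A of length sum = (H//3) * (W//3). The image is
    edge-padded so that surrounding regions near the border exist."""
    img = np.asarray(img, dtype=np.float64)
    H, W = img.shape
    padded = np.pad(img, 6, mode='edge')
    # block (i, j) has its centre at (x, y) = (3j+1, 3i+1), shifted by the pad
    return np.array([yxt_value(padded, 3 * j + 1 + 6, 3 * i + 1 + 6)
                     for i in range(H // 3) for j in range(W // 3)])
```

With p running from 1 to 8 as in equation (2), each value lies between 0 and 2 + 4 + … + 256 = 510.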
S4: Compute the same ordered-comparison feature description for each candidate image block in the candidate set; each candidate block yields one feature vector;
Following the method of step S3, each candidate image block in the set {Bi(x, y) | i = 1, …, 100} is given an ordered-comparison feature description; each yields a feature vector of length sum, denoted {YXT_Bi | i = 1, …, 100}.
S5: Compute the distance between the target image and each candidate image from their feature vectors, and select the minimum as the best match.
Taking any candidate image Bi(x, y) as an example: its feature vector is YXT_Bi and that of the target image A(x, y) is YXT_A; their difference, i.e. the distance dist(Bi), is computed with the Euclidean distance:
dist(Bi) = √( Σn=1…sum (YXT_A(n) − YXT_Bi(n))² )  (3)
After the distance to the target image has been computed for every candidate image with the above method, the distances are sorted, and the candidate image with the minimum distance is taken as the best matching result.
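A minimal sketch of this final selection step, assuming the feature vectors from steps S3 and S4 are NumPy arrays (the function name is illustrative):

```python
import numpy as np

def best_match(yxt_target, yxt_candidates):
    """Step S5: Euclidean distance (equation (3)) between the target's
    feature vector and each candidate's; the smallest distance wins.
    Returns (index of best candidate, its distance)."""
    dists = [float(np.sqrt(((yxt_target - v) ** 2).sum()))
             for v in yxt_candidates]
    best = int(np.argmin(dists))
    return best, dists[best]
```

Since every edit just compares fixed-length integer vectors, this step is cheap compared with the sliding search; the expensive feature description is only computed for the 100 candidates that survive the gray-scale stage.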
The invention first finds a set of candidate image blocks by a simple gray-scale matching, then describes each image with a high-precision ordered-comparison feature and computes matching scores, and finally obtains the target matching result. The method considers matching accuracy and time complexity together: it still achieves good matching performance without excessive computation and is suitable for a variety of video image processing systems.
Brief description of the drawings
Fig. 1 is a flowchart of the target matching method for multiple cameras of the invention;
Fig. 2 is a schematic diagram of the ordered-comparison feature template.
Specific embodiment
To make the object, technical solution and advantages of the invention clearer, the embodiments of the invention are described in further detail below with reference to the accompanying drawings.
A target matching method for multiple cameras comprises the following steps:
Two cameras shoot the target synchronously, producing two video images P1(x, y) and P2(x, y) of the target. The target region is then marked in the first image P1(x, y); here this is done by manual specification. A rectangular image containing the target is selected by hand and denoted target image A(x, y); the rectangle size is a × b.
Improving matching accuracy requires a complex feature description of the image, but that also raises the time complexity of matching. The invention considers matching accuracy and time complexity together and proposes a staged target matching method: a simple gray-scale matching first finds a set of candidate image blocks, then a high-precision ordered-comparison feature description is computed for each image and used to score the matches, yielding the final matching result. The concrete steps are as follows:
(1) Using a window of size a × b, perform a sliding search over image P2(x, y) and find a set of candidate image blocks by gray-scale matching;
Each position of the window yields an image block of size a × b on image P2(x, y), denoted B(x, y). The gray-level difference SAD(B) between B(x, y) and the target image A(x, y) is computed:
SAD(B) = Σ |A(x, y) − B(x, y)|  (1)
By this definition, the smaller the gray-level difference, the higher the degree of match. After the sliding search is finished, all the difference values are sorted, and the 100 image blocks with the smallest values are kept as the candidate image block set, denoted {Bi(x, y) | i = 1, …, 100}.
(2) Compute an ordered-comparison feature description of the target image A(x, y), yielding one feature vector;
S2.1: Partition the target image A(x, y) into small image blocks of size 3 × 3 each; this yields sum = (a/3) × (b/3) small blocks (assuming a and b are multiples of 3), denoted {k1(x, y), k2(x, y), …, ksum(x, y)}.
S2.2: Apply the ordered-comparison coding to every 3 × 3 small block of the target image A(x, y);
The ordered-comparison coding of a 3 × 3 small block works as follows: measure the ordered gray-mean comparison relations between the small block and eight equally sized, uniformly distributed rectangular regions around it, and concatenate these eight ordered comparison relations into a binary-coded descriptor, giving one feature value. The ordered-comparison coding proposed by the invention borrows the idea of the local binary pattern descriptor, but comparisons between rectangular regions are more stable and reliable than comparisons between single pixels, because computing a gray mean amounts to mean filtering, which suppresses image noise well.
An arbitrary small block kj(x, y) is taken below to illustrate the ordered-comparison coding.
a. For any small block kj(x, y) of the target image A(x, y), compute its gray mean μ = (1/9) Σ(x,y)∈Ω kj(x, y), where Ω is the set of all pixels of kj(x, y).
b. Centred on the small block kj(x, y), select eight uniformly distributed rectangular regions of size 3 × 3 around it: denoting the centre pixel of kj(x, y) as (x0, y0), the centre pixels of the eight surrounding 3 × 3 regions are (x0−6, y0), (x0−3, y0+3), (x0, y0+6), (x0+3, y0+3), (x0+6, y0), (x0+3, y0−3), (x0, y0−6), (x0−3, y0−3). Compute the gray means of these eight rectangular regions in this order, denoted {μ1, μ2, …, μ8}.
c. The ordered comparison takes the gray mean μ of the small block kj(x, y) as the threshold: each of the surrounding gray means {μ1, μ2, …, μ8} is compared with μ and marked 1 if it exceeds μ, 0 otherwise. The comparisons thus produce 8 binary digits, which are converted to a decimal number to obtain the ordered-comparison feature value YXT(kj) of the block. Its formula is:
YXT(kj) = Σp=1…8 s(μp − μ) · 2^p  (2)
where s(z) = 1 if z > 0, and s(z) = 0 otherwise. As shown in Fig. 2, each tile is 3 × 3; the blocks labelled 1 to 8 are the eight rectangular regions compared with the small block kj(x, y).
d. Apply the ordered-comparison coding of steps a to c to every small block of target image A(x, y); this finally yields a feature vector of length sum, denoted YXT_A.
(3) Following the method of step (2), each candidate image block in the set {Bi(x, y) | i = 1, …, 100} is given an ordered-comparison feature description; each yields a feature vector of length sum, denoted {YXT_Bi | i = 1, …, 100}.
(4) Compute the distance between the target image and each candidate image from their feature vectors, and select the minimum as the best match.
Taking any candidate image Bi(x, y) as an example: its feature vector is YXT_Bi and that of the target image A(x, y) is YXT_A; their difference, i.e. the distance dist(Bi), is computed with the Euclidean distance:
dist(Bi) = √( Σn=1…sum (YXT_A(n) − YXT_Bi(n))² )  (3)
After the distance to the target image has been computed for every candidate image, the distances are sorted, and the candidate image with the minimum distance is taken as the best matching result.
The above describes preferred embodiments of the invention in order to explain its technical features in detail; the invention is not limited to the concrete forms described in the embodiments, and other modifications and variations made according to the purport of the invention are also protected by this patent. The purport of the invention is defined by the claims rather than by the specific description of the embodiments.

Claims (6)

1. A target matching method for multiple cameras, characterised by comprising the following steps:
S1: two cameras shoot the target synchronously, producing two video images P1(x, y) and P2(x, y) of the target; the target region is then marked in the first image P1(x, y): a rectangular image containing the target is selected by hand and denoted target image A(x, y), with rectangle size a × b;
S2: using a window of size a × b, perform a sliding search over image P2(x, y) and find a set of candidate image blocks by gray-scale matching;
S3: compute an ordered-comparison feature description of target image A(x, y), yielding one feature vector;
S4: compute the ordered-comparison feature description for each candidate image block in the candidate image block set; each candidate block yields one feature vector;
S5: compute the distance between the target image and each candidate image from their feature vectors, and select the minimum as the best matching result.
2. The target matching method for multiple cameras according to claim 1, characterised in that in S2 the candidate image block set is obtained as follows: during the sliding scan, each position of the window yields an image block of size a × b on image P2(x, y), denoted B(x, y); the gray-level difference SAD(B) of B(x, y) and target image A(x, y) is computed:
SAD(B) = Σ |A(x, y) − B(x, y)|  (1)
after the sliding search is finished, all gray-level difference values are sorted, and the 100 image blocks with the smallest values are kept as the candidate image block set, denoted {Bi(x, y) | i = 1, …, 100}.
3. The target matching method for multiple cameras according to claim 2, characterised in that S3 is implemented as follows:
S3.1: partition target image A(x, y) into small image blocks of size 3 × 3 each; this yields sum = (a/3) × (b/3) small blocks, denoted {k1(x, y), k2(x, y), …, ksum(x, y)};
S3.2: apply the ordered-comparison coding to every 3 × 3 small block of target image A(x, y);
the ordered-comparison coding of a 3 × 3 small block is: measure the ordered gray-mean comparison relations between the small block and eight equally sized, uniformly distributed rectangular regions around it, and concatenate these eight ordered comparison relations into a binary-coded descriptor, giving one feature value;
applying this coding to every small block of image A(x, y) finally yields a feature vector of length sum, denoted YXT_A.
4. The target matching method for multiple cameras according to claim 3, characterised in that S3.2 is implemented as follows:
a. for any small block kj(x, y) of target image A(x, y), compute its gray mean μ = (1/9) Σ(x,y)∈Ω kj(x, y), where Ω is the set of all pixels of kj(x, y);
b. centred on the small block kj(x, y), select eight uniformly distributed rectangular regions of size 3 × 3 around it: denoting the centre pixel of kj(x, y) as (x0, y0), the centre pixels of the eight surrounding 3 × 3 regions are (x0−6, y0), (x0−3, y0+3), (x0, y0+6), (x0+3, y0+3), (x0+6, y0), (x0+3, y0−3), (x0, y0−6), (x0−3, y0−3); compute the gray means of these eight rectangular regions in this order, denoted {μ1, μ2, …, μ8};
c. the ordered comparison takes the gray mean μ of the small block kj(x, y) as the threshold: each of the surrounding gray means {μ1, μ2, …, μ8} is compared with μ and marked 1 if greater than μ, 0 otherwise; the comparisons thus produce an 8-digit binary number, which is converted to decimal to obtain the ordered-comparison feature value YXT(kj), computed as:
YXT(kj) = Σp=1…8 s(μp − μ) · 2^p  (2)
where s(z) = 1 if z > 0, and s(z) = 0 otherwise;
d. apply the ordered-comparison coding of steps a to c to every small block of target image A(x, y); this finally yields a feature vector of length sum, denoted YXT_A.
5. The target matching method for multiple cameras according to claim 3 or 4, characterised in that in step S4, following the method of step S3, each candidate image block in {Bi(x, y) | i = 1, …, 100} is given an ordered-comparison feature description; each candidate image block yields a feature vector of length sum, denoted {YXT_Bi | i = 1, …, 100}.
6. The target matching method for multiple cameras according to claim 5, characterised in that step S5 is implemented as follows:
for any candidate image Bi(x, y) with feature vector YXT_Bi, the difference from the feature vector YXT_A of target image A(x, y), i.e. the distance dist(Bi), is computed with the Euclidean distance:
dist(Bi) = √( Σn=1…sum (YXT_A(n) − YXT_Bi(n))² )  (3)
after the distance to the target image has been computed for every candidate image with the above method, the distances are sorted, and the candidate image with the minimum distance is taken as the best matching result.
CN201710034514.2A 2017-01-17 2017-01-17 Target matching method for multiple cameras Pending CN106778915A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710034514.2A CN106778915A (en) 2017-01-17 2017-01-17 Target matching method for multiple cameras

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201710034514.2A CN106778915A (en) 2017-01-17 2017-01-17 Target matching method for multiple cameras

Publications (1)

Publication Number Publication Date
CN106778915A true CN106778915A (en) 2017-05-31

Family

ID=58947164

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710034514.2A Pending CN106778915A (en) 2017-01-17 2017-01-17 Target matching method for multiple cameras

Country Status (1)

Country Link
CN (1) CN106778915A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112613568A (en) * 2020-12-29 2021-04-06 清华大学 Target identification method and device based on visible light and infrared multispectral image sequence
CN112907443A (en) * 2021-02-05 2021-06-04 深圳市优象计算技术有限公司 Video super-resolution reconstruction method and system for satellite camera

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105374010A (en) * 2015-09-22 2016-03-02 江苏省电力公司常州供电公司 A panoramic image generation method

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105374010A (en) * 2015-09-22 2016-03-02 江苏省电力公司常州供电公司 A panoramic image generation method

Non-Patent Citations (5)

* Cited by examiner, † Cited by third party
Title
ZHEN SUN et al.: "Application of image retrieval based on the improved local binary pattern", Proceedings of the 4th International Conference on Computer Engineering and Networks *
廖胜才: "Multi-camera object tracking based on online learning", PhD thesis, Graduate University of the Chinese Academy of Sciences *
王蕾 et al.: "A hybrid algorithm combining gray-scale and feature matching", Proceedings of the 2005 National Conference on New Automation Technologies *
程远航: "Research on UAV Aerial Remote Sensing Image Mosaic Technology", Tsinghua University Press, 31 August 2016 *
舒付祥 et al.: "Design and research of an image matching algorithm based on gray-scale features", Computer Engineering and Applications *

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112613568A (en) * 2020-12-29 2021-04-06 清华大学 Target identification method and device based on visible light and infrared multispectral image sequence
CN112907443A (en) * 2021-02-05 2021-06-04 深圳市优象计算技术有限公司 Video super-resolution reconstruction method and system for satellite camera
CN112907443B (en) * 2021-02-05 2023-06-16 深圳市优象计算技术有限公司 Video super-resolution reconstruction method and system for satellite camera

Similar Documents

Publication Publication Date Title
CN105184801B A high-precision registration method for optical and SAR images based on a multi-level strategy
Su et al. Vanishing point constrained lane detection with a stereo camera
CN108960211B (en) Multi-target human body posture detection method and system
EP2724295B1 (en) System and method for identifying scale invariant features of object outlines on images
Li et al. Automatic crack detection and measurement of concrete structure using convolutional encoder-decoder network
CN110599537A (en) Mask R-CNN-based unmanned aerial vehicle image building area calculation method and system
CN108550166B (en) Spatial target image matching method
CN111652790B (en) Sub-pixel image registration method
CN107862319B (en) Heterogeneous high-light optical image matching error eliminating method based on neighborhood voting
Pascoe et al. Robust direct visual localisation using normalised information distance.
CN102722887A (en) Image registration method and device
CN109308715A An optical image registration method combining point features and line features
CN113312973B (en) Gesture recognition key point feature extraction method and system
CN104978742A (en) Image registration method and image registration device based on cascade structure
CN108229500A A SIFT mismatched-point culling method based on function fitting
Liu et al. Dynamic RGB-D SLAM based on static probability and observation number
CN111080631A (en) Fault positioning method and system for detecting floor defects of spliced images
CN108537832B (en) Image registration method and image processing system based on local invariant gray feature
CN111563896B (en) Image processing method for detecting abnormality of overhead line system
CN112364881B (en) Advanced sampling consistency image matching method
Zhu et al. HMFCA-Net: Hierarchical multi-frequency based Channel attention net for mobile phone surface defect detection
CN103743750A (en) Method for generating distribution diagram of surface damage of heavy calibre optical element
CN106778915A (en) A kind of target matching method towards multiple-camera
CN107392948B (en) Image registration method of amplitude-division real-time polarization imaging system
CN111160142B (en) Certificate bill positioning detection method based on numerical prediction regression model

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20170531

RJ01 Rejection of invention patent application after publication