CN113222943B - Image deformation estimation method based on mixed grid transformation model - Google Patents

Image deformation estimation method based on mixed grid transformation model Download PDF

Info

Publication number
CN113222943B
CN113222943B CN202110539834.XA CN202110539834A CN113222943B CN 113222943 B CN113222943 B CN 113222943B CN 202110539834 A CN202110539834 A CN 202110539834A CN 113222943 B CN113222943 B CN 113222943B
Authority
CN
China
Prior art keywords
grid
matrix
characteristic point
transformation model
image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202110539834.XA
Other languages
Chinese (zh)
Other versions
CN113222943A (en
Inventor
杨宪强
张智浩
孙昊
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Ningbo Intelligent Equipment Research Institute Co ltd
Original Assignee
Ningbo Intelligent Equipment Research Institute Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Ningbo Intelligent Equipment Research Institute Co ltd filed Critical Ningbo Intelligent Equipment Research Institute Co ltd
Priority to CN202110539834.XA priority Critical patent/CN113222943B/en
Publication of CN113222943A publication Critical patent/CN113222943A/en
Application granted granted Critical
Publication of CN113222943B publication Critical patent/CN113222943B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Images

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/0002Inspection of images, e.g. flaw detection
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/30Determination of transform parameters for the alignment of images, i.e. image registration
    • G06T7/33Determination of transform parameters for the alignment of images, i.e. image registration using feature-based methods
    • G06T7/337Determination of transform parameters for the alignment of images, i.e. image registration using feature-based methods involving reference images or patches
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/40Extraction of image or video features
    • G06V10/46Descriptors for shape, contour or point-related descriptors, e.g. scale invariant feature transform [SIFT] or bags of words [BoW]; Salient regional features
    • G06V10/462Salient features, e.g. scale invariant feature transforms [SIFT]
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10004Still image; Photographic image

Abstract

An image deformation estimation method based on a hybrid grid transformation model belongs to the field of computer vision and image processing. The method solves the problem that the inter-image transformation model obtained by the existing estimation method cannot effectively solve the discontinuous deformation. Firstly, extracting and matching feature points of an image; then, carrying out primary classification and screening on the characteristic point pairs; and finally, estimating a hybrid grid transformation model. The method can accurately estimate the mixed grid transformation model and the component number of the mixed grid transformation model, and the estimated mixed grid transformation model can effectively solve the problem of discontinuous deformation. The invention can be used for estimating the image deformation.

Description

Image deformation estimation method based on mixed grid transformation model
Technical Field
The invention belongs to the field of computer vision and image processing, and particularly relates to an image deformation estimation method based on a hybrid grid transformation model.
Background
Image registration is a common problem in the field of computer vision and image processing, and can be applied to occasions such as image splicing, target positioning, three-dimensional reconstruction and the like. When an image of a real scene is shot, due to the fact that the distances between the positions of different objects and a camera are different, object displacement discontinuity exists in the images shot from different viewpoints, image registration operation needs to be conducted on the images shot from the different viewpoints, but the problem of discontinuous deformation cannot be effectively solved through an inter-image transformation model obtained through an existing estimation method.
Disclosure of Invention
The invention aims to solve the problem that a transformation model between images obtained by the existing estimation method cannot effectively solve the discontinuous deformation, and provides an image deformation estimation method based on a mixed grid transformation model.
The technical scheme adopted by the invention for solving the technical problems is as follows: an image deformation estimation method based on a hybrid mesh transformation model specifically comprises the following steps:
step one, respectively extracting reference images I of the same scene shot under different visual angles0And a target image I1And for reference image I0SIFT feature point and target image I1The SIFT feature points are matched to obtain an initial feature point pair set S0
Figure BDA0003071219330000011
Wherein the content of the first and second substances,
Figure BDA0003071219330000012
as the pair of initial characteristic points,
Figure BDA0003071219330000013
for reference picture I0The coordinates of the feature point in (2),
Figure BDA0003071219330000014
is a target image I1Characteristic point coordinates of (1), N0Is a set S0The number of the middle initial characteristic point pairs;
step two, classifying and screening the initial characteristic point pairs obtained in the step one, wherein the set of the characteristic point pairs obtained through screening is S,
Figure BDA0003071219330000015
(xi,yi) For pairs of characteristic points, x, obtained by screeningiFor reference picture I0Coordinates of the feature points in (1), yiIs a target image I1The coordinate of the characteristic point in the set S, N is the number of the characteristic point pairs in the set S, and the characteristic point coordinates in the set S are obtainedThe total number of categories of characteristic point pairs included in the set S is represented as K, and the category of the ith characteristic point pair is represented as c for each category of characteristic point pairi,ci=1,...,K;
Step three, for the reference image I0Carrying out grid division, wherein the width of each grid is w, and the height of each grid is h; taking a mixed grid transformation model consisting of K grid transformation models as a reference image I0And a target image I1Inter-geometric transformation models, each mesh transformation model constituting a component of the hybrid mesh transformation model;
wherein, the grid deformation parameter to be solved of each component is wk,k=1,2,…,K,wkIs the initial covariance matrix of
Figure BDA0003071219330000021
Figure BDA0003071219330000022
Is a matrix of0The inverse matrix of (d);
step four, respectively calculating each characteristic point xiCorresponding intermediate variable matrix phii
Initializing a responsibility matrix R, wherein the size of R is NxK, and the ith row and kth column elements in the responsibility matrix R are Ri,kIf k is equal to ciThen r isi,k1, otherwise ri,k0; initializing the iteration number n to be 0;
step six, for each component, utilizing phiiAnd R calculates wkMean value m of the posterior probability distribution ofkCovariance matrix
Figure BDA0003071219330000023
And a parameter alphak
Figure BDA0003071219330000024
Figure BDA0003071219330000025
Figure BDA0003071219330000026
Wherein the covariance matrix
Figure BDA0003071219330000027
Is ΛkInverse matrix of, beta and alpha0Is a constant;
step seven, utilizing mk
Figure BDA0003071219330000028
And alphakUpdating element values R in the responsibility matrix Ri,kThe updated element value is r'i,k
And step eight, updating the iteration number n to n +1 according to the element value r 'updated in the step seven'i,kComputing
Figure BDA0003071219330000029
Step nine, repeating the process from the step six to the step eight until n is more than 1 and
Figure BDA00030712193300000210
stopping iteration and obtaining m in the last iterationkAs wkAn estimate of (d).
The invention has the beneficial effects that: the invention provides an image deformation estimation method based on a hybrid grid transformation model, which comprises the steps of firstly extracting and matching characteristic points of an image; then, carrying out primary classification and screening on the characteristic point pairs; and finally, estimating a hybrid grid transformation model. The method can accurately estimate the mixed grid transformation model and the component number of the mixed grid transformation model, and the estimated mixed grid transformation model can effectively solve the problem of discontinuous deformation.
Drawings
FIG. 1 is a flow chart of an image deformation estimation method based on a hybrid mesh transformation model according to the present invention;
FIG. 2a) is the same scene graph one taken at a different viewpoint;
FIG. 2b) is the same scene graph two taken at a different viewpoint;
the characteristic points and grid lines of each component of the hybrid grid transformation model are represented in the figure.
Detailed Description
First embodiment this embodiment will be described with reference to fig. 1. The method for estimating image deformation based on a hybrid mesh transformation model according to the present embodiment specifically includes the following steps:
step one, respectively extracting reference images I of the same scene shot under different visual angles0And a target image I1And for reference image I0SIFT feature point and target image I1The SIFT feature points are matched to obtain an initial feature point pair set S0
Figure BDA0003071219330000031
Wherein the content of the first and second substances,
Figure BDA0003071219330000032
is the initial characteristic point pair, and the characteristic point pair,
Figure BDA0003071219330000033
for reference picture I0The coordinates of the feature points in (1),
Figure BDA0003071219330000036
is a target image I1Characteristic point coordinates of (1), N0Is a set S0The number of the middle initial characteristic point pairs;
SIFT is a commonly used feature point extraction and description algorithm, has the invariance of illumination, rotation and deformation to a certain degree, and is a commonly used method because the extracted feature points are stable and have high quality;
step two, classifying and screening the initial characteristic point pairs obtained in the step one, wherein the set of the characteristic point pairs obtained through screening isS,
Figure BDA0003071219330000034
(xi,yi) For pairs of characteristic points, x, obtained by screeningiFor reference picture I0Coordinates of the feature points in (1), yiIs a target image I1N is the number of characteristic point pairs in the set S, and the category of each characteristic point pair in the set S is obtained, the total number of categories of characteristic point pairs included in the set S is represented as K, and the category of the ith characteristic point pair is represented as ci,ci=1,...,K;
Step three, as shown in fig. 2a) and 2 b); for reference image I0Carrying out grid division, wherein the width of each grid is w, and the height of each grid is h; taking a mixed grid transformation model consisting of K grid transformation models as a reference image I0And a target image I1Inter-geometric transformation models, each mesh transformation model constituting a component of the hybrid mesh transformation model;
wherein, the grid deformation parameter to be solved of each component is wk,k=1,2,…,K,wkIs the initial covariance matrix of
Figure BDA0003071219330000035
Is a matrix of0The inverse matrix of (d);
step four, respectively calculating each characteristic point xiCorresponding intermediate variable matrix phiiPhi matrixiThe size of (a) is 2M × 2;
initializing a responsibility matrix R, wherein the size of R is NxK, and the ith row and kth column elements in the responsibility matrix R are Ri,kIf k is equal to ciThen r isi,k1, otherwise ri,k0; initializing the iteration number n to be 0;
step six, for each component, utilizing phiiAnd R calculates wkMean value m of the posterior probability distribution ofkCovariance matrix
Figure BDA0003071219330000041
And a parameter alphak
Figure BDA0003071219330000042
Figure BDA0003071219330000043
Figure BDA0003071219330000044
Wherein the covariance matrix
Figure BDA0003071219330000045
Is ΛkInverse matrix of, beta and alpha0Is a constant;
step seven, utilizing mk
Figure BDA0003071219330000046
And alphakUpdating element values R in the responsibility matrix Ri,kThe updated element value is r'i,k
And step eight, updating the iteration number n to n +1 according to the element value r 'updated in the step seven'i,kComputing
Figure BDA0003071219330000047
Step nine, repeating the process from the step six to the step eight until n is more than 1 and
Figure BDA0003071219330000048
stopping iteration and obtaining m in the last iterationkAs wkAn estimate of (d).
I.e. when n > 1 and all components are satisfied simultaneously
Figure BDA0003071219330000049
Then m obtained in the last iteration is addedkAs wkIs estimated value of. In the iteration process, the element value R in the updated responsibility matrix R obtained in the iteration is usedi,kFor the next iteration process.
Estimating the model by the expectation maximization iteration step has the advantages of avoiding overfitting and automatically determining the number of model components.
The second embodiment is as follows: the difference between this embodiment and the first embodiment is that, in the first step, the reference image I is searched based on the FLANN fast nearest neighbor search library0SIFT feature point and target image I1The SIFT feature points are matched.
The FLANN fast nearest neighbor search library can realize fast nearest neighbor matching of high-dimensional feature points.
The third concrete implementation mode: the difference between this embodiment and the second embodiment is that the method for classifying and screening the initial feature point pairs obtained in the first step is a RANSAC algorithm.
RANSAC is a parameterized model estimation algorithm and is characterized in that the method is robust to abnormal points and can obtain a correct model of effective sample data.
The fourth concrete implementation mode is as follows: the third difference between this embodiment and the specific embodiment is that the classification and screening of the initial feature point pairs obtained in step one includes the following specific processes:
step two, setting the current residual characteristic point pair set as S', and initializing the set as S0(ii) a Representing the characteristic point pair set obtained through screening as S, and initializing the set S into an empty set; initializing the serial number of the current category as c' ═ 1;
secondly, extracting interior points from the set S' by adopting RANSAC algorithm to obtain an interior point set Sc′Setting an inner point distance threshold value as Q and iteration times as L;
step two and three, if Sc′If the number of the characteristic point pairs is more than or equal to 10, updating S to S + Sc′And will aggregate Sc′Setting the class number of the middle characteristic point pair to be c ' and updating S ' ═ S ' -Sc′,c′=c′+1;
The second step four,Repeatedly executing the second step and the third step until S appearsc′If the number of the characteristic point pairs is less than 10, the process is ended.
The fifth concrete implementation mode: the fourth difference between this embodiment and the fourth embodiment is that the model selected by the RANSAC algorithm is a homography matrix transformation model.
The sixth specific implementation mode: the fourth difference between this embodiment and the specific embodiment is that the value of the interior point distance threshold Q is 3, and the value of the iteration number L is 500.
The seventh embodiment: the sixth embodiment is different from the sixth embodiment in that the matrix Λ0The expression of (a) is:
Λ0=cUTU
wherein c is a constant, and the value is 5 in the invention; u is a sparse matrix of 2 Mx 2M size, M is for reference image I0The number of grid points obtained by grid division and the element U of the mth row and mth column in the sparse matrix Um,m1, mth row of U
Figure BDA0003071219330000051
Elements of a column
Figure BDA0003071219330000052
Figure BDA0003071219330000053
The number of the grid points connected to the mth grid point is the serial number of the grid point connected to the mth grid point (according to the actual situation, the number of the grid points connected to the mth grid point may be 2,3, or 4), and the remaining elements in U are all 0; the superscript T stands for transpose.
The specific implementation mode is eight: the difference between this embodiment and the seventh embodiment is that the specific process of step four is as follows:
characteristic point xiThe sequence numbers of 4 grid points of the grid are
Figure BDA0003071219330000054
The bilinear interpolation weights of 1,2,3,4 and 4 grid points are respectively
Figure BDA0003071219330000055
Then the intermediate variable matrix phiiTo (1) a
Figure BDA0003071219330000056
The row column 1 element has a value of
Figure BDA0003071219330000057
First, the
Figure BDA0003071219330000058
Row column 2 element has a value of
Figure BDA0003071219330000059
The values of the remaining elements are all 0.
In the present invention, the numbers of the grid points may be arranged in any manner, but it is only required that the grid point numbers in this embodiment and the grid point numbers in the seventh embodiment are obtained in the same arrangement manner.
The specific implementation method nine: the present embodiment is different from the first embodiment in that the value of the constant β is 0.05.
The detailed implementation mode is ten: in this embodiment, the constant α is different from the first embodiment0Is 0.01.
The concrete implementation mode eleven: the difference between this embodiment and the eighth embodiment is that the specific process of the seventh step is as follows:
Figure BDA0003071219330000061
where ρ isi,kIs an intermediate variable;
Figure BDA0003071219330000062
where ψ (-) is a double gamma function and tr (-) is the trace of the matrix.
The above-described calculation examples of the present invention are merely to explain the calculation model and the calculation flow of the present invention in detail, and are not intended to limit the embodiments of the present invention. It will be apparent to those skilled in the art that other variations and modifications of the present invention can be made based on the above description, and it is not intended to be exhaustive or to limit the invention to the precise form disclosed, and all such modifications and variations are possible and contemplated as falling within the scope of the invention.

Claims (9)

1. An image deformation estimation method based on a hybrid mesh transformation model is characterized by specifically comprising the following steps:
step one, respectively extracting reference images I of the same scene shot under different visual angles0And a target image I1And for reference image I0SIFT feature point and target image I1The SIFT feature points are matched to obtain an initial feature point pair set S0
Figure FDA0003517128270000011
Wherein the content of the first and second substances,
Figure FDA0003517128270000012
as the pair of initial characteristic points,
Figure FDA0003517128270000013
for reference picture I0The coordinates of the feature points in (1),
Figure FDA0003517128270000014
is a target image I1Characteristic point coordinates of (1), N0Is a set S0The number of the middle initial characteristic point pairs;
step two, classifying and screening the initial characteristic point pairs obtained in the step one, wherein the set of the characteristic point pairs obtained through screening is S,
Figure FDA0003517128270000015
(xi,yi) For pairs of characteristic points, x, obtained by screeningiFor reference picture I0Coordinates of the feature points in (1), yiIs a target image I1N is the number of characteristic point pairs in the set S, and the category of each characteristic point pair in the set S is obtained, the total number of categories of characteristic point pairs included in the set S is represented as K, and the category of the ith characteristic point pair is represented as ci,ci=1,...,K;
Step three, for the reference image I0Carrying out grid division, wherein the width of each grid is w, and the height of each grid is h; taking a mixed grid transformation model consisting of K grid transformation models as a reference image I0And a target image I1Inter-geometric transformation models, each mesh transformation model constituting a component of the hybrid mesh transformation model;
wherein, the grid deformation parameter to be solved of each component is wk,k=1,2,…,K,wkIs the initial covariance matrix of
Figure FDA0003517128270000016
Figure FDA00035171282700000113
Is a matrix of0The inverse matrix of (d);
step four, respectively calculating each characteristic point xiCorresponding intermediate variable matrix phii
The specific process of the step four is as follows:
characteristic point xiThe sequence numbers of the 4 grid points of the grid are
Figure FDA0003517128270000017
The bilinear interpolation weights of the 4 grid points are respectively
Figure FDA0003517128270000018
Then the intermediate variable matrix phiiTo (1) a
Figure FDA0003517128270000019
The row column 1 element has a value of
Figure FDA00035171282700000110
First, the
Figure FDA00035171282700000111
Row column 2 element has a value of
Figure FDA00035171282700000112
The values of the other elements are all 0;
initializing a responsibility matrix R, wherein the size of R is NxK, and the ith row and kth column elements in the responsibility matrix R are Ri,kIf k is equal to ciThen r isi,k1, otherwise ri,k0; initializing the iteration number n to be 0;
step six, for each component, utilizing phiiAnd R calculates wkMean value m of the posterior probability distribution ofkCovariance matrix
Figure FDA0003517128270000021
And a parameter alphak
Figure FDA0003517128270000022
Figure FDA0003517128270000023
Figure FDA0003517128270000024
Wherein the covariance matrix
Figure FDA0003517128270000025
Is ΛkInverse matrix of, beta and alpha0Is a constant;
step seven, utilizing mk
Figure FDA0003517128270000026
And alphakUpdating element values R in the responsibility matrix Ri,kWhere the updated element value is denoted by r'i,k
The concrete process of the seventh step is as follows:
Figure FDA0003517128270000027
where ρ isi,kIs an intermediate variable;
Figure FDA0003517128270000028
wherein psi (-) is a double gamma function, and tr (-) is the trace of the matrix;
and step eight, updating the iteration number n to n +1 according to the element value r 'updated in the step seven'i,kComputing
Figure FDA0003517128270000029
Step nine, repeating the process from the step six to the step eight until n is more than 1 and
Figure FDA00035171282700000210
stopping iteration and obtaining m in the last iterationkAs wkAn estimate of (d).
2. The method according to claim 1, wherein in the first step, the reference image I is estimated based on a FLANN fast nearest neighbor search library0SIFT feature point and target image I1The SIFT feature points are matched.
3. The method for estimating image deformation based on hybrid mesh transform model as claimed in claim 2, wherein the initial feature point pairs obtained in the first step are classified and screened by using RANSAC algorithm.
4. The method according to claim 3, wherein the classifying and screening of the initial feature point pairs obtained in the first step comprises:
step two, setting the current residual feature point pair set as S', and initializing the feature point pair set as S0(ii) a Representing the characteristic point pair set obtained through screening as S, and initializing the set S into an empty set; initializing the serial number of the current category as c' ═ 1;
secondly, extracting interior points from the set S' by adopting RANSAC algorithm to obtain an interior point set Sc′Setting an inner point distance threshold value as Q and iteration times as L;
step two and step three, if Sc′If the number of the characteristic point pairs is more than or equal to 10, updating S to S + Sc′And will aggregate Sc′Setting the class number of the middle characteristic point pair to be c ' and updating S ' ═ S ' -Sc′,c′=c′+1;
Step two and step four, repeatedly executing step two and step three until S appearsc′If the number of the characteristic point pairs is less than 10, the process is ended.
5. The method for image deformation estimation based on hybrid grid transformation model as claimed in claim 4, wherein the RANSAC algorithm uses a homography matrix transformation model to classify and screen the initial feature point pairs.
6. The image deformation estimation method based on the hybrid mesh transformation model according to claim 4, wherein the value of the interior point distance threshold Q is 3, and the value of the iteration number L is 500.
7. The image deformation estimation method based on the hybrid mesh transformation model as claimed in claim 6, wherein the matrix Λ is0The expression of (a) is:
Λ0=cUTU
wherein c is a constant; u is a sparse matrix of 2 Mx 2M size, M is for reference image I0The number of grid points obtained by grid division and the element U of the mth row and mth column in the sparse matrix Um,m1, mth row of U
Figure FDA0003517128270000031
Elements of a column
Figure FDA0003517128270000032
Figure FDA0003517128270000033
The number of the grid point connected with the mth grid point is the serial number of the grid point, and the rest elements in the U are 0; the superscript T stands for transpose.
8. The image deformation estimation method based on the hybrid mesh transform model according to claim 1, wherein the value of the constant β is 0.05.
9. The method of claim 1, wherein the constant α is the same as the constant α0Is 0.01.
CN202110539834.XA 2021-05-18 2021-05-18 Image deformation estimation method based on mixed grid transformation model Active CN113222943B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110539834.XA CN113222943B (en) 2021-05-18 2021-05-18 Image deformation estimation method based on mixed grid transformation model

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110539834.XA CN113222943B (en) 2021-05-18 2021-05-18 Image deformation estimation method based on mixed grid transformation model

Publications (2)

Publication Number Publication Date
CN113222943A CN113222943A (en) 2021-08-06
CN113222943B true CN113222943B (en) 2022-05-03

Family

ID=77092573

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110539834.XA Active CN113222943B (en) 2021-05-18 2021-05-18 Image deformation estimation method based on mixed grid transformation model

Country Status (1)

Country Link
CN (1) CN113222943B (en)

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5651075A (en) * 1993-12-01 1997-07-22 Hughes Missile Systems Company Automated license plate locator and reader including perspective distortion correction
CN101221658A (en) * 2007-12-20 2008-07-16 四川川大智胜软件股份有限公司 Cylinder frame buffer texture re-labeling geometric correction method based on software
CN101572828A (en) * 2009-05-20 2009-11-04 长春理工大学 Method for correcting distortion in real time based on GPU camera and video camera
CN102194212A (en) * 2010-03-08 2011-09-21 佳能株式会社 Image processing method, device and system
CN103729849A (en) * 2013-12-31 2014-04-16 南京航空航天大学 Method for calculating digital image morphing initial value
CN103729841A (en) * 2013-12-18 2014-04-16 同济大学 Camera distortion correcting method based on square target model and perspective projection
CN104574363A (en) * 2014-12-12 2015-04-29 南京邮电大学 Full reference image quality assessment method in consideration of gradient direction difference
CN104994283A (en) * 2015-06-30 2015-10-21 广东欧珀移动通信有限公司 Correction method for local distortion and mobile terminal
CN108780586A (en) * 2016-03-09 2018-11-09 株式会社理光 Image processing method, display equipment and inspection system

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
AU2016266019A1 (en) * 2016-11-30 2018-06-14 Canon Kabushiki Kaisha Image registration method

Patent Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5651075A (en) * 1993-12-01 1997-07-22 Hughes Missile Systems Company Automated license plate locator and reader including perspective distortion correction
CN101221658A (en) * 2007-12-20 2008-07-16 四川川大智胜软件股份有限公司 Cylinder frame buffer texture re-labeling geometric correction method based on software
CN101572828A (en) * 2009-05-20 2009-11-04 长春理工大学 Method for correcting distortion in real time based on GPU camera and video camera
CN102194212A (en) * 2010-03-08 2011-09-21 佳能株式会社 Image processing method, device and system
CN103729841A (en) * 2013-12-18 2014-04-16 同济大学 Camera distortion correcting method based on square target model and perspective projection
CN103729849A (en) * 2013-12-31 2014-04-16 南京航空航天大学 Method for calculating digital image morphing initial value
CN104574363A (en) * 2014-12-12 2015-04-29 南京邮电大学 Full reference image quality assessment method in consideration of gradient direction difference
CN104994283A (en) * 2015-06-30 2015-10-21 广东欧珀移动通信有限公司 Correction method for local distortion and mobile terminal
CN108780586A (en) * 2016-03-09 2018-11-09 株式会社理光 Image processing method, display equipment and inspection system

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
Distorted Building Image Matching with Automatic Viewpoint Rectification and Fusion;Linwei Yue 等;《MDPI》;20191127;全文 *
变形图驱动的人体与脸部网格模板拟合系统;黎子聪 等;《计算机辅助设计与图形学学报》;20190731;全文 *
基于图像块分配的鲁棒模板匹配方法;张智浩 等;《2020中国自动化大会(CA2020)论文集》;20210131;全文 *

Also Published As

Publication number Publication date
CN113222943A (en) 2021-08-06

Similar Documents

Publication Publication Date Title
CN108038435B (en) Feature extraction and target tracking method based on convolutional neural network
JP6395555B2 (en) How to process video captured from a scene
CN112328715B (en) Visual positioning method, training method of related model, related device and equipment
CN107506792B (en) Semi-supervised salient object detection method
CN111768415A (en) Image instance segmentation method without quantization pooling
CN110490894B (en) Video foreground and background separation method based on improved low-rank sparse decomposition
Temel et al. CSV: Image quality assessment based on color, structure, and visual system
CN113177592B (en) Image segmentation method and device, computer equipment and storage medium
CN112215079B (en) Global multistage target tracking method
CN112084952B (en) Video point location tracking method based on self-supervision training
CN111127353B (en) High-dynamic image ghost-removing method based on block registration and matching
CN107392211B (en) Salient target detection method based on visual sparse cognition
CN112784747B (en) Multi-scale eigen decomposition method for hyperspectral remote sensing image
CN104680181B (en) SAR image superpixel segmentation method based on likelihood ratio feature
CN117036756B (en) Remote sensing image matching method and system based on variation automatic encoder
CN111161348A (en) Monocular camera-based object pose estimation method, device and equipment
CN110738695B (en) Image feature point mismatching and removing method based on local transformation model
CN113222943B (en) Image deformation estimation method based on mixed grid transformation model
CN110136164B (en) Method for removing dynamic background based on online transmission transformation and low-rank sparse matrix decomposition
CN111814884A (en) Target detection network model upgrading method based on deformable convolution
Zhou et al. Stn-homography: estimate homography parameters directly
CN108765384B (en) Significance detection method for joint manifold sequencing and improved convex hull
CN110570450A (en) Target tracking method based on cascade context-aware framework
WO2022239216A1 (en) Training device, training method, image processing device, and image processing method
CN113920391A (en) Target counting method based on generated scale self-adaptive true value graph

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant