CN102629330A - Rapid and high-precision matching method of depth image and color image

Rapid and high-precision matching method of depth image and color image

Info

Publication number
CN102629330A
CN102629330A, CN2012100500338A, CN201210050033A
Authority
CN
China
Prior art keywords
image
depth image
point
color image
depth
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN2012100500338A
Other languages
Chinese (zh)
Inventor
徐向民
苗捷
赵思敏
崔东顺
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
South China University of Technology SCUT
Original Assignee
South China University of Technology SCUT
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by South China University of Technology SCUT filed Critical South China University of Technology SCUT
Priority to CN2012100500338A priority Critical patent/CN102629330A/en
Publication of CN102629330A publication Critical patent/CN102629330A/en
Pending legal-status Critical Current


Abstract

The invention discloses a rapid and high-precision matching method of a depth image and a color image. The method comprises the following steps: (1) performing inter-frame smoothing and histogram equalization on the depth image captured by a depth camera, and median filtering on the color image; (2) performing multi-scale-space Harris-Laplace corner detection on the depth image and the color image to obtain feature points; (3) describing the detected feature points; (4) matching the feature points of the depth image and the color image with a KNN algorithm to obtain candidate feature point matching pairs; (5) eliminating mismatched candidate feature point matching pairs with a RANSAC algorithm to obtain the feature point matching pairs. The method solves the problem that prior-art matching methods are difficult to apply to matching between depth images and optical images, and improves robustness and accuracy.

Description

A rapid and high-precision matching method of a depth image and a color image
Technical field
The present invention relates to the technical field of image processing, and in particular to a rapid and high-precision matching method of a depth image and a color image.
Background art
With the continual emergence of novel sensors, the ability to acquire images has improved rapidly, and images produced by sensors of different characteristics keep increasing in variety. Because the image data obtained by any single sensor has limitations and defects, a single image or a single type of image data often cannot provide a comprehensive and deep understanding, and such data are also hard-pressed to meet the demands of experiments and applications. For this reason, the images obtained by different sensors need to be combined through image matching and fusion techniques, and even used to construct three-dimensional models, so that we can understand and grasp the target more clearly.
As one of the key and fundamental techniques of image processing, image matching has been widely applied in scientific research and in people's production and daily life. Feature-point-based image matching is widely accepted and increasingly used, with SIFT and Harris being the most representative operators of this class. However, the demands placed on real-time performance, robustness, and speed are growing ever more stringent. In particular, in real-time algorithms for feature-point-based object recognition, localization, and tracking, traditional feature point operators such as SIFT, SURF, and Harris cannot meet the speed requirements.
Moreover, with the emergence of depth cameras such as the Microsoft Kinect, depth sensors are increasingly applied in scientific research, and the depth images produced by depth sensors are essentially different from the optical images familiar before. Traditional matching methods are almost all applied to the matching of optical images, with only a few applied to matching between depth images; for the problem of matching a depth image and an optical image, two images of completely different nature, a cost-effective matching method is rarely to be found.
Summary of the invention
To overcome the above-mentioned deficiencies of the prior art, the object of the present invention is to provide a rapid and high-precision matching method of a depth image and a color image.
The object of the invention is achieved through the following technical scheme:
A rapid and high-precision matching method of a depth image and a color image comprises the following steps:
(1) performing inter-frame smoothing and histogram equalization on the depth image captured by a depth camera, and median filtering on the color image;
(2) performing multi-scale-space Harris-Laplace corner detection on the depth image and the color image respectively to obtain feature points;
(3) describing the detected feature points;
(4) matching the feature points of the depth image and the color image with a KNN algorithm to obtain candidate feature point matching pairs;
(5) eliminating mismatched candidate feature point matching pairs with a RANSAC algorithm to obtain the feature point matching pairs.
In step (2), performing multi-scale-space Harris-Laplace corner detection on the depth image and the color image respectively to obtain feature points is specifically:
(2-1) defining spatial scales;
(2-2) convolving the depth image with the Gaussian kernel of each scale to obtain the scale space of the depth image, and convolving the color image with the Gaussian kernel of each scale to obtain the scale space of the color image;
(2-3) computing the second moment matrix at each scale of the depth-image scale space and of the color-image scale space respectively;
(2-4) computing the corner response value from the second moment matrix;
(2-5) taking the local maxima of the corner response value as candidate feature points;
(2-6) computing the normalized LoG value of each candidate feature point;
(2-7) taking the local maxima of the normalized LoG value across scales, i.e. the Harris corners of the scale space, as the feature points.
In step (3), describing the detected feature points is specifically:
(3-1) computing the Gaussian image according to the scale value of each feature point of the depth image and the color image;
(3-2) computing, by finite differences, the orientation and magnitude of the image gradient in the region around each feature point in the Gaussian image;
(3-3) accumulating the orientations and magnitudes of the image gradient in the feature point region into a histogram, whose peak represents the principal direction of the feature point;
(3-4) dividing 0°–360° into a plurality of direction ranges, accumulating the image gradient strength of each direction, and obtaining a multi-dimensional SIFT feature vector.
In step (4), matching the feature points of the depth image and the color image with the KNN algorithm to obtain candidate feature point matching pairs is specifically:
first computing the Euclidean distances between the feature vectors of all feature points of the two images, whereby a feature vector and its nearest neighbor form a preliminarily confirmed matching vector pair; then taking the nearest-neighbor distance and the second-nearest-neighbor distance of each feature vector and setting a threshold; when the ratio of the two distances is below the threshold, the preliminarily confirmed matching vector pair is a candidate matching vector pair, and the corresponding feature points form a candidate matched feature point pair.
In step (2-1), defining the spatial scales is specifically: defining fixed scales σ_1, σ_2, ..., σ_n, where σ_n = k^n σ_0; defining the integration scale σ_I, a variable representing the current scale of the Harris corner; and defining the differentiation scale σ_D, a variable representing the variation of differential values near the Harris corner, where σ_D = s·σ_I, and k and s are determined according to the actual situation.
In step (2-3), the second moment matrix computed at each scale of the scale space is given by:

$$M = \mu(\omega, \sigma_I, \sigma_D) = \sigma_D^2 \, g(\sigma_I) \otimes \begin{pmatrix} L_x^2(\omega, \sigma_D) & L_x L_y(\omega, \sigma_D) \\ L_x L_y(\omega, \sigma_D) & L_y^2(\omega, \sigma_D) \end{pmatrix}$$

where g(σ_I) denotes the Gaussian convolution kernel of scale σ_I, ω denotes the image position, L(ω) denotes the Gaussian-smoothed image, the symbol ⊗ denotes convolution, and L_x(ω, σ_D) and L_y(ω, σ_D) denote the derivatives in the x and y directions, respectively, of the image smoothed with the Gaussian g(σ_D).
In step (2-4), computing the corner response value from the second moment matrix is specifically:

corner response value HrF = det(μ(ω, σ_I, σ_D)) − α·trace²(μ(ω, σ_I, σ_D)), where α is determined according to the actual situation, det denotes the determinant, and trace denotes the trace of a matrix; for example, for $X = \begin{pmatrix} A & B \\ B & C \end{pmatrix}$, trace(X) = A + C.
In step (2-7), taking the local maxima of the normalized LoG value across scales as feature points is specifically:

let μ(ω, σ_n) denote the normalized LoG value of a candidate feature point at scale σ_n; when the normalized LoG value satisfies the following conditions, the corresponding candidate feature point is a feature point:

μ(ω, σ_n) > μ(ω, σ_{n−1})
μ(ω, σ_n) > μ(ω, σ_{n+1})
μ(ω, σ_n) > threshold_L
The inter-frame smoothing in step (1) is smoothing across 11 consecutive frames.
The median filtering in step (1) is 7 × 7 median filtering.
Compared with the prior art, the present invention has the following advantages and technical effects:
1. Performing multi-scale-space Harris-Laplace corner detection on the depth image and the color image to obtain feature points speeds up the feature-point-based image matching algorithm and greatly improves robustness and accuracy in the presence of affine transformations and scale changes.
2. Performing inter-frame smoothing and histogram equalization on the depth image and median filtering on the color image in advance solves the problem that the prior art is difficult to apply to the field of matching depth images and optical images; the method has wide application prospects in image fusion, remote sensing image processing, pattern recognition, and computer vision.
Description of drawings
Fig. 1 is the flow chart of the rapid and high-precision matching method of a depth image and a color image of the present invention.
Detailed description
The present invention is described in further detail below in conjunction with an embodiment and the accompanying drawing, but the implementations of the present invention are not limited thereto.
Embodiment
As shown in Fig. 1, the rapid and high-precision matching method of a depth image and a color image of the present invention comprises the following steps:
(1) performing inter-frame smoothing across 11 consecutive frames and histogram equalization on the depth image captured by the depth camera, and 7 × 7 median filtering on the color image;
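A minimal preprocessing sketch (OpenCV/NumPy). The patent does not name the inter-frame smoothing kernel, so a simple temporal mean over the 11 frames is assumed here, and the depth frames are assumed to be single-channel 8-bit:

```python
import numpy as np
import cv2

def preprocess_depth(depth_frames):
    """Step (1a): inter-frame smoothing across 11 consecutive depth frames
    (temporal mean assumed), followed by histogram equalization."""
    assert len(depth_frames) == 11
    smoothed = np.mean(np.stack(depth_frames), axis=0).astype(np.uint8)
    return cv2.equalizeHist(smoothed)

def preprocess_color(color_image):
    """Step (1b): 7 x 7 median filtering of the color image."""
    return cv2.medianBlur(color_image, 7)
```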
(2) performing multi-scale-space Harris-Laplace corner detection on the depth image and the color image respectively to obtain feature points (a code sketch follows step (2-7)):
(2-1) defining spatial scales: fixed scales σ_1, σ_2, ..., σ_n, where σ_n = k^n σ_0; the integration scale σ_I, a variable representing the current scale of the Harris corner; and the differentiation scale σ_D, a variable representing the variation of differential values near the Harris corner, where σ_D = s·σ_I, k = 1.5, and s = 0.7;
(2-2) convolving the depth image with the Gaussian kernel of each scale to obtain the scale space of the depth image, and convolving the color image with the Gaussian kernel of each scale to obtain the scale space of the color image;
(2-3) computing the second moment matrix at each scale of the scale space:

$$M = \mu(\omega, \sigma_I, \sigma_D) = \sigma_D^2 \, g(\sigma_I) \otimes \begin{pmatrix} L_x^2(\omega, \sigma_D) & L_x L_y(\omega, \sigma_D) \\ L_x L_y(\omega, \sigma_D) & L_y^2(\omega, \sigma_D) \end{pmatrix}$$

where g(σ_I) denotes the Gaussian convolution kernel of scale σ_I, ω denotes the image position, L(ω) denotes the Gaussian-smoothed image, the symbol ⊗ denotes convolution, and L_x(ω, σ_D) and L_y(ω, σ_D) denote the derivatives in the x and y directions, respectively, of the image smoothed with the Gaussian g(σ_D);
(2-4) computing the corner response value from the second moment matrix, specifically: corner response value HrF = det(μ(ω, σ_I, σ_D)) − α·trace²(μ(ω, σ_I, σ_D)), with α = 0.05, where det denotes the determinant and trace denotes the trace of a matrix; for example, for $X = \begin{pmatrix} A & B \\ B & C \end{pmatrix}$, trace(X) = A + C;
(2-5) taking the local maxima of the corner response value as candidate feature points;
(2-6) computing the normalized LoG value of each candidate feature point: let μ(ω, σ_n) denote the normalized LoG value at scale σ_n; when the normalized LoG value satisfies the following conditions, the corresponding candidate feature point is a feature point:

μ(ω, σ_n) > μ(ω, σ_{n−1})
μ(ω, σ_n) > μ(ω, σ_{n+1})
μ(ω, σ_n) > threshold_L
(2-7) taking the local maxima of the normalized LoG value across scales, i.e. the Harris corners of the scale space, as the feature points.
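The corner detection steps (2-1) through (2-7) can be sketched as follows; this is a minimal illustration, not the patent's reference implementation. It assumes NumPy/SciPy, illustrative values for σ_0, the number of scales, and threshold_L (the patent leaves them open), and the scale-normalized Laplacian σ²|L_xx + L_yy| as the normalized LoG value, which is the standard choice in Harris-Laplace:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def scale_space(image, sigma0=1.0, k=1.5, n_scales=5):
    """Step (2-2): Gaussian scale space with sigma_n = k**n * sigma0
    (sigma0 and n_scales are illustrative values)."""
    img = image.astype(np.float64)
    sigmas = [sigma0 * k ** n for n in range(n_scales)]
    return sigmas, [gaussian_filter(img, s) for s in sigmas]

def harris_response(image, sigma_i, s=0.7, alpha=0.05):
    """Steps (2-3)/(2-4): scale-adapted second moment matrix and corner
    response HrF = det(M) - alpha * trace(M)**2 at integration scale
    sigma_i, differentiation scale sigma_d = s * sigma_i."""
    sigma_d = s * sigma_i
    img = image.astype(np.float64)
    # Derivatives of the image smoothed at the differentiation scale
    lx = gaussian_filter(img, sigma_d, order=(0, 1))
    ly = gaussian_filter(img, sigma_d, order=(1, 0))
    # Entries of M, windowed by g(sigma_i) and scaled by sigma_d**2
    m_xx = sigma_d ** 2 * gaussian_filter(lx * lx, sigma_i)
    m_yy = sigma_d ** 2 * gaussian_filter(ly * ly, sigma_i)
    m_xy = sigma_d ** 2 * gaussian_filter(lx * ly, sigma_i)
    det = m_xx * m_yy - m_xy ** 2
    trace = m_xx + m_yy
    return det - alpha * trace ** 2

def normalized_log(image, sigma):
    """Step (2-6): scale-normalized LoG, sigma**2 * |Lxx + Lyy| (assumed form)."""
    img = image.astype(np.float64)
    lxx = gaussian_filter(img, sigma, order=(0, 2))
    lyy = gaussian_filter(img, sigma, order=(2, 0))
    return sigma ** 2 * np.abs(lxx + lyy)

def is_feature_point(log_stack, n, y, x, threshold_l=10.0):
    """Step (2-7): a candidate at scale index n is a feature point only if
    its normalized LoG value is a local maximum across neighboring scales
    and exceeds threshold_l (an illustrative threshold)."""
    v = log_stack[n][y, x]
    return (v > log_stack[n - 1][y, x] and
            v > log_stack[n + 1][y, x] and
            v > threshold_l)
```

Candidate points themselves come from step (2-5), as local maxima of harris_response within each scale.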
(3) describing the detected feature points, specifically (a code sketch follows step (3-4)):
(3-1) computing the Gaussian image according to the scale value of each feature point of the depth image and the color image;
(3-2) computing, by finite differences, the gradient magnitude m(x, y) and orientation θ(x, y) of the image in the region of radius 3 × 1.5σ centered on the feature point in the Gaussian image, by the formulas:

$$m(x, y) = \sqrt{(L(x+1, y) - L(x-1, y))^2 + (L(x, y+1) - L(x, y-1))^2}$$

$$\theta(x, y) = \tan^{-1}\!\left(\frac{L(x, y+1) - L(x, y-1)}{L(x+1, y) - L(x-1, y)}\right)$$
(3-3) accumulating the orientations and magnitudes of the image gradient in the feature point region into a histogram of 36 bins, one bin per 10°; the histogram peak represents the principal direction of the feature point;
(3-4) dividing 0°–360° into 8 direction ranges of 45° each and accumulating the image gradient strength of each of the 8 directions; since there are 4 × 4 subregions, this gives 4 × 4 × 8 = 128 values, finally forming a 128-dimensional SIFT feature vector.
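Steps (3-2) through (3-4) can be sketched as follows, assuming a 16 × 16 sample window around the feature point and omitting the Gaussian weighting and interpolation of a full SIFT implementation:

```python
import numpy as np

def gradient_mag_ori(L, x, y):
    """Step (3-2): gradient magnitude m(x, y) and orientation theta(x, y)
    by central finite differences on the Gaussian image L."""
    dx = L[y, x + 1] - L[y, x - 1]
    dy = L[y + 1, x] - L[y - 1, x]
    return np.hypot(dx, dy), np.arctan2(dy, dx)  # quadrant-aware tan^-1

def principal_direction(mags, oris):
    """Step (3-3): 36-bin histogram (10 degrees per bin) of orientations
    weighted by magnitude; the peak bin gives the principal direction."""
    hist, edges = np.histogram(oris % (2 * np.pi), bins=36,
                               range=(0, 2 * np.pi), weights=mags)
    return edges[np.argmax(hist)]

def sift_descriptor(mags, oris):
    """Step (3-4): 4 x 4 subregions x 8 orientation bins (45 degrees each)
    = 128-dimensional vector; mags and oris are 16 x 16 arrays around the
    feature point, orientations already rotated to the principal direction."""
    desc = np.zeros((4, 4, 8))
    for i in range(16):
        for j in range(16):
            b = int((oris[i, j] % (2 * np.pi)) / (2 * np.pi) * 8) % 8
            desc[i // 4, j // 4, b] += mags[i, j]
    vec = desc.ravel()
    return vec / (np.linalg.norm(vec) + 1e-12)  # normalize for Euclidean matching
```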
(4) matching the feature points of the depth image and the color image with the KNN algorithm to obtain candidate feature point matching pairs: first the Euclidean distances between the feature vectors of all feature points of the two images are computed, whereby a feature vector and its nearest neighbor form a preliminarily confirmed matching vector pair; the nearest-neighbor distance and the second-nearest-neighbor distance of each feature vector are then taken and a threshold is set; when the ratio of the two distances is below the threshold, the preliminarily confirmed matching vector pair is a candidate matching vector pair, and the corresponding feature points form a candidate matched feature point pair.
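With OpenCV this matching step can be sketched as follows; the ratio threshold 0.8 is an assumed value (the patent only states that a threshold is set), and the descriptors are assumed to be float32 arrays such as those returned by sift_descriptor above:

```python
import numpy as np
import cv2

def knn_ratio_match(desc_depth, desc_color, ratio=0.8):
    """Step (4): for every depth-image feature vector, find its two nearest
    color-image neighbors by Euclidean distance and keep the pair only when
    nearest distance < ratio * second-nearest distance."""
    matcher = cv2.BFMatcher(cv2.NORM_L2)
    knn = matcher.knnMatch(np.float32(desc_depth), np.float32(desc_color), k=2)
    return [m for m, n in knn if m.distance < ratio * n.distance]
```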
(5) eliminating mismatched candidate feature point matching pairs with the RANSAC algorithm to obtain the feature point matching pairs: a subset of the matched feature points is chosen as observation data to derive the transformation matrix between the two images, which brings the two images into registration through translation, rotation, and scaling; through a limited number of iterations, the group of matching points that best fits reality is finally selected, thereby eliminating false matches and obtaining the feature point matching pairs.
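A sketch of the RANSAC step using OpenCV's homography estimator as the transformation model; the patent describes translation, rotation, and scaling, of which a full homography is a common superset, and the 5.0-pixel reprojection threshold is an assumption:

```python
import numpy as np
import cv2

def ransac_filter(kp_depth, kp_color, matches):
    """Step (5): estimate the inter-image transform from the candidate
    matches with RANSAC and keep only the inlier matching pairs
    (kp_depth and kp_color are lists of cv2.KeyPoint; needs >= 4 matches)."""
    src = np.float32([kp_depth[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    dst = np.float32([kp_color[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
    H, mask = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
    return H, [m for m, ok in zip(matches, mask.ravel()) if ok]
```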
The above embodiment is a preferred implementation of the present invention, but the implementations of the present invention are not limited to the above embodiment; any other change, modification, substitution, combination, or simplification made without departing from the spirit and principle of the present invention shall be an equivalent replacement and is included within the protection scope of the present invention.

Claims (10)

1. A rapid and high-precision matching method of a depth image and a color image, characterized by comprising the following steps:
(1) performing inter-frame smoothing and histogram equalization on the depth image captured by a depth camera, and median filtering on the color image;
(2) performing multi-scale-space Harris-Laplace corner detection on the depth image and the color image respectively to obtain feature points;
(3) describing the detected feature points;
(4) matching the feature points of the depth image and the color image with a KNN algorithm to obtain candidate feature point matching pairs;
(5) eliminating mismatched candidate feature point matching pairs with a RANSAC algorithm to obtain the feature point matching pairs.
2. The rapid and high-precision matching method of a depth image and a color image according to claim 1, characterized in that performing multi-scale-space Harris-Laplace corner detection on the depth image and the color image respectively in step (2) to obtain feature points is specifically:
(2-1) defining spatial scales;
(2-2) convolving the depth image with the Gaussian kernel of each scale to obtain the scale space of the depth image, and convolving the color image with the Gaussian kernel of each scale to obtain the scale space of the color image;
(2-3) computing the second moment matrix at each scale of the depth-image scale space and of the color-image scale space respectively;
(2-4) computing the corner response value from the second moment matrix;
(2-5) taking the local maxima of the corner response value as candidate feature points;
(2-6) computing the normalized LoG value of each candidate feature point;
(2-7) taking the local maxima of the normalized LoG value across scales, i.e. the Harris corners of the scale space, as the feature points.
3. The rapid and high-precision matching method of a depth image and a color image according to claim 2, characterized in that describing the detected feature points in step (3) is specifically:
(3-1) computing the Gaussian image according to the scale value of each feature point of the depth image and the color image;
(3-2) computing, by finite differences, the orientation and magnitude of the image gradient in the region around each feature point in the Gaussian image;
(3-3) accumulating the orientations and magnitudes of the image gradient in the feature point region into a histogram, whose peak represents the principal direction of the feature point;
(3-4) dividing 0°–360° into a plurality of direction ranges, accumulating the image gradient strength of each direction, and obtaining a multi-dimensional SIFT feature vector.
4. The rapid and high-precision matching method of a depth image and a color image according to claim 2, characterized in that matching the feature points of the depth image and the color image with the KNN algorithm in step (4) to obtain candidate feature point matching pairs is specifically:
first computing the Euclidean distances between the feature vectors of all feature points of the two images, whereby a feature vector and its nearest neighbor form a preliminarily confirmed matching vector pair; then taking the nearest-neighbor distance and the second-nearest-neighbor distance of each feature vector and setting a threshold; when the ratio of the two distances is below the threshold, the preliminarily confirmed matching vector pair is a candidate matching vector pair, and the corresponding feature points form a candidate matched feature point pair.
5. The rapid and high-precision matching method of a depth image and a color image according to claim 1, characterized in that defining the spatial scales in step (2-1) is specifically: defining fixed scales σ_1, σ_2, ..., σ_n, where σ_n = k^n σ_0; defining the integration scale σ_I, a variable representing the current scale of the Harris corner; and defining the differentiation scale σ_D, a variable representing the variation of differential values near the Harris corner, where σ_D = s·σ_I, and k and s are determined according to the actual situation.
6. The rapid and high-precision matching method of a depth image and a color image according to claim 5, characterized in that the second moment matrix computed in step (2-3) is given by:

$$M = \mu(\omega, \sigma_I, \sigma_D) = \sigma_D^2 \, g(\sigma_I) \otimes \begin{pmatrix} L_x^2(\omega, \sigma_D) & L_x L_y(\omega, \sigma_D) \\ L_x L_y(\omega, \sigma_D) & L_y^2(\omega, \sigma_D) \end{pmatrix}$$

where g(σ_I) denotes the Gaussian convolution kernel of scale σ_I, ω denotes the image position, L(ω) denotes the Gaussian-smoothed image, the symbol ⊗ denotes convolution, and L_x(ω, σ_D) and L_y(ω, σ_D) denote the derivatives in the x and y directions, respectively, of the image smoothed with the Gaussian g(σ_D).
7. The rapid and high-precision matching method of a depth image and a color image according to claim 6, characterized in that computing the corner response value from the second moment matrix in step (2-4) is specifically:
corner response value HrF = det(μ(ω, σ_I, σ_D)) − α·trace²(μ(ω, σ_I, σ_D)), where α is determined according to the actual situation, det denotes the determinant, and trace denotes the trace of a matrix.
8. The rapid and high-precision matching method of a depth image and a color image according to claim 7, characterized in that taking the local maxima of the normalized LoG value across scales in step (2-7) as feature points is specifically:
let μ(ω, σ_n) denote the normalized LoG value of a candidate feature point at scale σ_n; when the normalized LoG value satisfies the following conditions, the corresponding candidate feature point is a feature point:

μ(ω, σ_n) > μ(ω, σ_{n−1})
μ(ω, σ_n) > μ(ω, σ_{n+1})
μ(ω, σ_n) > threshold_L
9. The rapid and high-precision matching method of a depth image and a color image according to claim 1, characterized in that the inter-frame smoothing in step (1) is smoothing across 11 consecutive frames.
10. The rapid and high-precision matching method of a depth image and a color image according to claim 1, characterized in that the median filtering in step (1) is 7 × 7 median filtering.
CN2012100500338A 2012-02-29 2012-02-29 Rapid and high-precision matching method of depth image and color image Pending CN102629330A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN2012100500338A CN102629330A (en) 2012-02-29 2012-02-29 Rapid and high-precision matching method of depth image and color image

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN2012100500338A CN102629330A (en) 2012-02-29 2012-02-29 Rapid and high-precision matching method of depth image and color image

Publications (1)

Publication Number Publication Date
CN102629330A true CN102629330A (en) 2012-08-08

Family

ID=46587588

Family Applications (1)

Application Number Title Priority Date Filing Date
CN2012100500338A Pending CN102629330A (en) 2012-02-29 2012-02-29 Rapid and high-precision matching method of depth image and color image

Country Status (1)

Country Link
CN (1) CN102629330A (en)



Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101777129A (en) * 2009-11-25 2010-07-14 中国科学院自动化研究所 Image matching method based on feature detection

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
周志强 et al.: "Fast target recognition based on local scale-invariant features" (基于局部尺度不变特征的快速目标识别), 《光学技术》 (Optical Technique), vol. 34, no. 5, 15 September 2008 (2008-09-15), pages 742-745 *
温文雅 et al.: "An image matching algorithm based on feature points" (一种基于特征点的图像匹配算法), Proceedings of the 2009 Conference on System Simulation Technology and Its Applications (《2009系统仿真技术及其应用学术会议论文集》), 1 August 2009 (2009-08-01), pages 752-756 *

Cited By (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102915540A (en) * 2012-10-10 2013-02-06 南京大学 Image matching method based on improved Harris-Laplace and scale invariant feature transform (SIFT) descriptor
CN103164695A (en) * 2013-02-26 2013-06-19 中国农业大学 Fruit identification method based on multi-source image information fusion
CN103164695B (en) * 2013-02-26 2016-02-17 中国农业大学 A kind of fruit identification method based on multi-source image information fusion
CN103279315A (en) * 2013-04-24 2013-09-04 电子科技大学 Real-time desktop remote sharing method
CN104123532B (en) * 2013-04-28 2017-05-10 浙江大华技术股份有限公司 Target object detection and target object quantity confirming method and device
CN103294813A (en) * 2013-06-07 2013-09-11 北京捷成世纪科技股份有限公司 Sensitive image search method and device
CN103985132A (en) * 2014-05-30 2014-08-13 中国科学院遥感与数字地球研究所 Mismatching point iterative detection method based on K neighbor graphs
CN103985132B (en) * 2014-05-30 2017-04-19 中国科学院遥感与数字地球研究所 Mismatching point iterative detection method based on K neighbor graphs
CN105654423B (en) * 2015-12-28 2019-03-26 西安电子科技大学 Remote sensing image registration method based on region
CN105654423A (en) * 2015-12-28 2016-06-08 西安电子科技大学 Area-based remote sensing image registration method
CN105590108A (en) * 2016-02-25 2016-05-18 云南大学 Scene identification method used in noisy environment
CN105590108B (en) * 2016-02-25 2018-08-17 云南大学 A kind of scene recognition method under noisy environment
CN108335327A (en) * 2017-01-19 2018-07-27 富士通株式会社 Video camera Attitude estimation method and video camera attitude estimating device
CN108335327B (en) * 2017-01-19 2021-09-24 富士通株式会社 Camera attitude estimation method and camera attitude estimation device
CN108445443A (en) * 2018-02-07 2018-08-24 武汉大学 A kind of fingerprint point clustering method based on KNN
CN108445443B (en) * 2018-02-07 2021-06-04 武汉大学 KNN-based fingerprint point clustering method
CN109712112A (en) * 2018-11-22 2019-05-03 中北大学 Taking photo by plane based on local feature is insulated the localization method of subgraph
CN109712112B (en) * 2018-11-22 2022-06-24 中北大学 Aerial photography insulator image positioning method based on local features
CN110490923A (en) * 2019-08-22 2019-11-22 江西渥泰环保科技有限公司 The Pixel-level alignment methods of depth of field space-time 3-D image
CN111006676A (en) * 2019-11-14 2020-04-14 广东博智林机器人有限公司 Map construction method, device and system
CN111006676B (en) * 2019-11-14 2021-08-31 广东博智林机器人有限公司 Map construction method, device and system

Similar Documents

Publication Publication Date Title
CN102629330A (en) Rapid and high-precision matching method of depth image and color image
CN103411553B (en) The quick calibrating method of multi-linear structured light vision sensors
Choi et al. RGB-D edge detection and edge-based registration
CN103020945B (en) A kind of remote sensing image registration method of Multiple Source Sensor
CN106204574B (en) Camera pose self-calibrating method based on objective plane motion feature
CN103700099B (en) Rotation and dimension unchanged wide baseline stereo matching method
CN101556692A (en) Image mosaic method based on neighborhood Zernike pseudo-matrix of characteristic points
CN102661708B (en) High-density packaged element positioning method based on speeded up robust features (SURFs)
CN101621711B (en) Method for calibrating camera by adopting two same circles
CN101383899A (en) Video image stabilizing method for space based platform hovering
CN102313536A (en) Method for barrier perception based on airborne binocular vision
CN104599258A (en) Anisotropic characteristic descriptor based image stitching method
CN105389774A (en) Method and device for aligning images
CN104121902A (en) Implementation method of indoor robot visual odometer based on Xtion camera
CN104036480A (en) Surf algorithm based quick mismatching point eliminating method
CN103679193A (en) FREAK-based high-speed high-density packaging component rapid location method
CN113160287B (en) Complex component point cloud splicing method and system based on feature fusion
CN101488224B (en) Characteristic point matching method based on relativity measurement
CN103632142A (en) Local coordinate system feature description based image matching method
CN112862768B (en) Adaptive monocular VIO (visual image analysis) initialization method based on point-line characteristics
CN109308715A (en) A kind of optical imagery method for registering combined based on point feature and line feature
CN108154066B (en) Three-dimensional target identification method based on curvature characteristic recurrent neural network
CN105513094A (en) Stereo vision tracking method and stereo vision tracking system based on 3D Delaunay triangulation
CN103700082B (en) Image split-joint method based on dual quaterion relative orientation
CN103914690A (en) Shape matching method based on projective invariant

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C02 Deemed withdrawal of patent application after publication (patent law 2001)
WD01 Invention patent application deemed withdrawn after publication

Application publication date: 20120808