CN105005993B - Fast and accurate three-dimensional terrain matching method based on heterogeneous projection - Google Patents

Fast and accurate three-dimensional terrain matching method based on heterogeneous projection

Info

Publication number
CN105005993B
CN105005993B (application CN201510397177.4A)
Authority
CN
China
Prior art keywords
perspective
view
matching
point
matched
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related
Application number
CN201510397177.4A
Other languages
Chinese (zh)
Other versions
CN105005993A (en)
Inventor
刘贵喜
方兰兰
吕孟娇
张娜
姚李阳
唐海军
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Xidian University
Original Assignee
Xidian University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Xidian University filed Critical Xidian University
Priority to CN201510397177.4A priority Critical patent/CN105005993B/en
Publication of CN105005993A publication Critical patent/CN105005993A/en
Application granted granted Critical
Publication of CN105005993B publication Critical patent/CN105005993B/en
Expired - Fee Related legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2200/00Indexing scheme for image data processing or generation, in general
    • G06T2200/04Indexing scheme for image data processing or generation, in general involving 3D image data

Landscapes

  • Processing Or Creating Images (AREA)

Abstract

The invention discloses a fast and accurate three-dimensional terrain matching method based on heterogeneous projection. The method comprises acquiring and converting three-dimensional DEM (Digital Elevation Model) terrain data, orthographically projecting the three-dimensional terrain, and then matching the projection images obtained from the orthographic projection. The projection images are matched by a method combining line features with point features: after straight lines are detected in the projection images and corresponding lines have been matched, the virtual corners at which the matched lines pairwise intersect are found and their coordinates computed. The virtual corners are then matched with an improved SURF (Speeded-Up Robust Features) algorithm, which combines the SURF, HARRIS, and NCC (Normalized Cross-Correlation) algorithms. Two lines are considered correctly matched if and only if their corresponding corners match. The transformation between the projection images is computed, and the transformation parameters are applied to the match between the three-dimensional terrains, completing the whole terrain matching process.

Description

Fast and accurate three-dimensional terrain matching method based on heterogeneous projection
Technical field
The invention belongs to the field of three-dimensional terrain matching, and specifically concerns a fast and accurate three-dimensional terrain matching method based on heterogeneous projection.
Background art
Terrain matching is one of the key technologies of terrain-aided navigation. It is widely used in aviation, in seafloor terrain matching and positioning for underwater vehicles, in robot navigation and positioning, and in land-vehicle navigation, and it has broad application prospects.
Many terrain matching algorithms exist, research on them continues at home and abroad, and the algorithms are constantly being refined. Among existing methods, some perform terrain matching using the minimum and maximum points of the Gaussian-curvature image as feature points. Others describe terrain contours with normalized wavelet descriptors and match the contours, but that approach fails where the terrain contours are indistinct. Direct 2D-to-3D matching with a visual-dictionary method has been shown to improve matching performance, but its data volume strains memory, it is time-consuming, and mismatched descriptors are not rejected during matching. There is also a terrain matching algorithm based on region features, whose drawback is that matching takes very long: it is too slow when the terrain data are large or the topography is complex. The present invention therefore proposes a new fast and accurate three-dimensional terrain matching method based on the orthographic projection of the terrain, which fully exploits the surface features of the terrain. In many situations the projection images are not homologous but heterogeneous; heterogeneous images lack consistent gray-level information, so traditional image matching methods do not apply. The invention matches the projection images by combining line features with point features. First, corresponding lines in the images are matched with a line-matching algorithm, and the pairwise intersections of the matched lines serve as virtual corners; point features are then used for a second match that rejects mismatched lines. The point features are matched with an improved SURF (Speeded-Up Robust Features) algorithm of higher matching precision, which combines the SURF, HARRIS, and NCC (Normalized Cross-Correlation) algorithms.
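The coarse descriptor match used on the virtual corners can be sketched as a nearest-neighbour search with a nearest/second-nearest distance ratio test, a standard choice not spelled out in the text. Real SURF descriptors are 64-dimensional vectors; random vectors stand in for them here, so the sketch illustrates only the matching logic, not SURF itself.

```python
# Coarse descriptor matching with a ratio test. The descriptors are
# synthetic stand-ins; only the matching logic follows the description.
import numpy as np

def ratio_match(desc_a, desc_b, ratio=0.7):
    """Return index pairs (i, j) where desc_a[i]'s nearest neighbour
    desc_b[j] beats the second-nearest by the given distance ratio."""
    pairs = []
    for i, d in enumerate(desc_a):
        dist = np.linalg.norm(desc_b - d, axis=1)
        j, k = np.argsort(dist)[:2]          # nearest, second-nearest
        if dist[j] < ratio * dist[k]:
            pairs.append((i, int(j)))
    return pairs

rng = np.random.default_rng(1)
desc_b = rng.random((5, 64))                          # "reference" corners
desc_a = desc_b[[2, 4]] + 0.001 * rng.random((2, 64)) # near-copies of rows 2, 4
pairs = ratio_match(desc_a, desc_b)
```

With near-duplicate descriptors the ratio test keeps exactly the two true correspondences and discards everything ambiguous.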
Summary of the invention
The object of the invention is to provide a fast and accurate three-dimensional terrain matching method based on heterogeneous projection, so as to improve the efficiency and precision of three-dimensional terrain matching.
The object of the invention is achieved as follows: a fast and accurate three-dimensional terrain matching method based on heterogeneous projection, characterized by comprising at least the following steps:
Step 1, convert the original digital elevation model terrain data format USGS-DEM to the digital elevation model terrain data format CNSTDF-DEM;
Step 2, orthographically project the reference three-dimensional terrain and the three-dimensional terrain to be matched, both in the terrain format described in step 1;
Step 3, match the projection images obtained from the orthographic projection of step 2 to obtain the projection transformation parameters, using a method that combines line features with point features;
Step 4, match the reference three-dimensional terrain and the three-dimensional terrain to be matched according to the transformation parameters of step 3, and obtain the matching result.
Step 1 comprises the following steps:
Step 11, open a terrain file;
Step 12, judge whether the opened file is in the USGS-DEM terrain data format;
Step 13, if not, go to step 11; if so, start reading the data header;
Step 14, extract the relevant terrain file header;
Step 15, store the file header;
Step 16, allocate the data space;
Step 17, read the data body;
Step 18, filter out the first 144 bytes of each row of the data body;
Step 19, store the relevant data information;
Step 110, save the stored file header and data body as a CNSTDF-DEM terrain format file;
Step 111, the terrain conversion is complete.
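The conversion loop of steps 11-111 can be sketched as follows. The exact record layouts of the two formats are not given in the text: the 1024-byte header and the 16-bit elevation samples below are illustrative assumptions, and only the 144-byte per-row prefix comes from step 18.

```python
# Sketch of the Step 1 format conversion (hypothetical layout: the header
# length and int16 elevations are assumptions; the 144-byte per-row prefix
# that is stripped comes from the patent text).
import io
import struct

HEADER_BYTES = 1024   # assumed file-header length
ROW_PREFIX = 144      # per-row metadata the patent filters out (step 18)
ROWS, COLS = 4, 8     # tiny synthetic grid

def read_dem_body(stream, rows, cols):
    """Skip the file header, then strip the 144-byte prefix of every row
    before reading cols little-endian int16 elevations per row."""
    stream.seek(HEADER_BYTES)
    grid = []
    for _ in range(rows):
        stream.read(ROW_PREFIX)                         # discard row metadata
        raw = stream.read(2 * cols)
        grid.append(list(struct.unpack("<%dh" % cols, raw)))
    return grid

# Build a synthetic file in memory: header + rows of (prefix + elevations).
buf = io.BytesIO()
buf.write(b"\x00" * HEADER_BYTES)
for r in range(ROWS):
    buf.write(b"\x00" * ROW_PREFIX)
    buf.write(struct.pack("<%dh" % COLS, *[r * 10 + c for c in range(COLS)]))
buf.seek(0)

dem = read_dem_body(buf, ROWS, COLS)
```

The stored header plus this filtered data body would then be written back out in the target format (step 110).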
Step 2 comprises the following steps:
Step 21, open the obtained three-dimensional CNSTDF-DEM terrain data;
Step 22, obtain the texture features of the terrain surface;
Step 23, obtain the three-dimensional terrain texture;
Step 24, obtain the orthographic terrain projection image from the texture.
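In the simplest case, a top-down orthographic projection of a DEM is the height grid itself rendered as an intensity image. The texture derivation of steps 22-23 is not specified, so this minimal sketch substitutes a plain min-max normalisation of elevation (an assumption):

```python
# Minimal orthographic projection sketch: normalise elevations to an
# 8-bit image. The patent derives texture from the surface; here the
# elevation itself stands in for the texture (an assumption).
import numpy as np

def ortho_project(dem):
    """Normalise elevations to [0, 255], giving an 8-bit projection image."""
    dem = np.asarray(dem, dtype=np.float64)
    lo, hi = dem.min(), dem.max()
    if hi == lo:                      # flat terrain -> uniform image
        return np.zeros(dem.shape, dtype=np.uint8)
    return np.round(255.0 * (dem - lo) / (hi - lo)).astype(np.uint8)

dem = [[0.0, 10.0], [20.0, 40.0]]
proj = ortho_project(dem)
```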
Step 3 comprises the following steps:
Step 31, detect and match straight lines in the projection image to be matched and the reference projection image; after the corresponding lines have been matched, take the pairwise intersections of the matched lines in the projection image to be matched as the virtual corners of the projection image to be matched;
Step 32, take the pairwise intersections of the matched lines in the reference projection image as the virtual corners of the reference projection image;
Step 33, coarsely match the virtual corners with the SURF algorithm;
Step 34, finely match the virtual corners with the SURF algorithm;
Step 35, obtain from this first match the set φ_AB of accurate feature-point pairs between the reference image and the image to be registered; from φ_AB the perspective transformation matrix H between the reference image and the image to be registered can be computed;
Step 36, transform the projection image to be matched according to the obtained transformation matrix H;
Step 37, interpolate the projection image to be matched;
Step 38, obtain the intermediate image after transformation and interpolation;
Step 39, obtain the reference projection image and the projection image to be matched;
Step 310, find the overlapping part of the reference projection image and the projection image to be matched as the respective regions of interest, and divide the reference image into multiple subregions according to the size of the overlap; when the overlap is large, the subregion size is 64 × 64; when the overlap is small, the subregions shrink accordingly;
Step 311, in each subregion of the reference projection image, extract HARRIS feature points in the 32 × 32 neighborhood around the subregion center, i.e. a neighborhood 0.5 times the subregion size, and take the point whose R value is largest among all HARRIS feature points in the region, i.e. the point most distinguishable from its surroundings, as the feature point of the reference projection image; if the 32 × 32 neighborhood contains no HARRIS feature point, treat the subregion center as the feature point;
Step 312, after all reference projection feature points have been extracted, perform NCC matching;
Step 313, in the intermediate image, scan the 96 × 96 neighborhood centered on each reference projection feature-point coordinate, i.e. a neighborhood 1.5 times the subregion size, recording the correlation coefficients and the feature-point coordinates to obtain coarse match points;
Step 314, after the 96 × 96 search is complete, compare the correlation coefficients of all recorded coarse match points, select the largest correlation coefficient, and apply the threshold T_NCC;
Step 315, if the largest correlation coefficient exceeds the given threshold T_NCC, the corresponding coordinate point is taken as the fine match between the intermediate-image feature point and the reference projection feature point;
Step 316, fit the fine match pairs by least squares;
Step 317, obtain the transformation matrix between the intermediate image and the reference projection image, yielding the final transformation parameters of the projection matching, and complete the projection-image matching.
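The "virtual corners" of steps 31-32 are intersections of matched lines that need not meet inside the image, so they are computed analytically. The patent does not give the computation; a standard choice, sketched here, is the cross product of the two lines in homogeneous coordinates (line representation ax + by + c = 0):

```python
# Virtual corner of two matched lines via homogeneous coordinates.
import numpy as np

def line_through(p, q):
    """Homogeneous line a*x + b*y + c = 0 through two image points."""
    return np.cross([p[0], p[1], 1.0], [q[0], q[1], 1.0])

def virtual_corner(l1, l2):
    """Intersection of two homogeneous lines; None if (nearly) parallel."""
    x = np.cross(l1, l2)
    if abs(x[2]) < 1e-12:
        return None
    return (x[0] / x[2], x[1] / x[2])

l1 = line_through((0.0, 0.0), (10.0, 10.0))   # y = x
l2 = line_through((0.0, 10.0), (10.0, 0.0))   # y = 10 - x
corner = virtual_corner(l1, l2)
```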
Step 4 comprises the following steps:
Step 41, obtain the transformation parameters produced on completion of the projection matching;
Step 42, return to the three-dimensional terrain and transform it;
Step 43, the three-dimensional terrain matching is complete.
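The least-squares fit of step 316 above is not spelled out. One common reading, sketched here under that assumption, fits an affine transform to the fine match pairs with numpy's least-squares solver:

```python
# Least-squares affine fit of fine match pairs (the affine model is an
# assumption; the patent only says "least squares").
import numpy as np

def fit_affine(src, dst):
    """Least-squares 2x3 affine map src -> dst, both (N, 2) point arrays."""
    src = np.asarray(src, dtype=float)
    dst = np.asarray(dst, dtype=float)
    A = np.hstack([src, np.ones((len(src), 1))])   # [x y 1] design matrix
    M, *_ = np.linalg.lstsq(A, dst, rcond=None)
    return M.T                                     # rows: [a b tx], [c d ty]

src = [[0, 0], [1, 0], [0, 1], [1, 1]]
dst = [[2, 3], [3, 3], [2, 4], [3, 4]]   # pure translation by (2, 3)
M = fit_affine(src, dst)
```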
The method comprises acquiring and converting three-dimensional DEM (Digital Elevation Model) terrain data, orthographically projecting the three-dimensional terrain, and then matching the projection images obtained from the orthographic projection. The projection images are matched by combining line features with point features: after straight lines are detected in the projection images and corresponding lines have been matched, the virtual corners at which the matched lines pairwise intersect are found and their coordinates computed. The virtual corners are then matched with the improved SURF algorithm; two lines are considered correctly matched if and only if their corresponding corners match. The transformation between the projection images is computed, and the transformation parameters are applied to the match between the three-dimensional terrains, completing the whole terrain matching process. The invention matches three-dimensional terrain by means of its projections and constitutes a new method of three-dimensional terrain matching. It can be applied to vision-based navigation of unmanned aerial vehicles, in degraded environments, under occlusion, and to the matching of heterogeneous projections.
The beneficial effects of the invention are: the surface features of the terrain are fully exploited and the orthographic projections of the three-dimensional terrain are matched, providing a new terrain matching algorithm that is also suitable for heterogeneous terrain projections. The improved SURF point-feature algorithm raises the precision of the projection matching.
Brief description of the drawings
Fig. 1 is the flow chart of the invention;
Fig. 2 is the flow chart of the terrain data format conversion;
Fig. 3 illustrates the acquisition of the terrain projection image;
Fig. 4 is the flow chart of the improved SURF algorithm;
Fig. 5 illustrates the terrain matching process;
Fig. 6 shows the results of SURF and of the improved SURF algorithm on heterogeneous projection images.
Embodiment
As shown in Fig. 1, the three-dimensional terrain matching procedure comprises the following steps:
Step 1, convert the original digital elevation model terrain data format USGS-DEM to the digital elevation model terrain data format CNSTDF-DEM;
Step 2, orthographically project the reference three-dimensional terrain and the three-dimensional terrain to be matched, both in the terrain format described in step 1;
Step 3, match the projection images obtained from the orthographic projection of step 2 to obtain the projection transformation parameters, using a method that combines line features with point features;
Step 4, match the reference three-dimensional terrain and the three-dimensional terrain to be matched according to the transformation parameters of step 3, and obtain the matching result.
As shown in Fig. 2, step 1 comprises the following steps:
Step 11, open a terrain file;
Step 12, judge whether the opened file is in the USGS-DEM terrain data format;
Step 13, if not, go to step 11; if so, start reading the data header;
Step 14, extract the relevant terrain file header;
Step 15, store the file header;
Step 16, allocate the data space;
Step 17, read the data body;
Step 18, filter out the first 144 bytes of each row of the data body;
Step 19, store the relevant data information;
Step 110, save the stored file header and data body as a CNSTDF-DEM terrain format file;
Step 111, the terrain conversion is complete.
As shown in Fig. 3, step 2 comprises the following steps:
Step 21, open the obtained three-dimensional CNSTDF-DEM terrain data;
Step 22, obtain the texture features of the terrain surface;
Step 23, obtain the three-dimensional terrain texture;
Step 24, obtain the orthographic terrain projection image from the texture.
As shown in Fig. 4, step 3 comprises the following steps:
Step 31, detect and match straight lines in the projection image to be matched and the reference projection image; after the corresponding lines have been matched, take the pairwise intersections of the matched lines in the projection image to be matched as the virtual corners of the projection image to be matched;
Step 32, take the pairwise intersections of the matched lines in the reference projection image as the virtual corners of the reference projection image;
Step 33, coarsely match the virtual corners with the SURF algorithm;
Step 34, finely match the virtual corners with the SURF algorithm;
Step 35, obtain from this first match the set φ_AB of accurate feature-point pairs between the reference image and the image to be registered; from φ_AB the perspective transformation matrix H between the reference image and the image to be registered can be computed;
Step 36, transform the projection image to be matched according to the obtained transformation matrix H;
Step 37, interpolate the projection image to be matched;
Step 38, obtain the intermediate image after transformation and interpolation;
Step 39, obtain the reference projection image and the projection image to be matched;
Step 310, find the overlapping part of the reference projection image and the projection image to be matched as the respective regions of interest, and divide the reference image into multiple subregions according to the size of the overlap; when the overlap is large, the subregion size is 64 × 64; when the overlap is small, the subregions shrink accordingly;
Step 311, in each subregion of the reference projection image, extract HARRIS feature points in the 32 × 32 neighborhood around the subregion center, i.e. a neighborhood 0.5 times the subregion size, and take the point whose R value is largest among all HARRIS feature points in the region, i.e. the point most distinguishable from its surroundings, as the feature point of the reference projection image; if the 32 × 32 neighborhood contains no HARRIS feature point, treat the subregion center as the feature point;
Step 312, after all reference projection feature points have been extracted, perform NCC matching;
Step 313, in the intermediate image, scan the 96 × 96 neighborhood centered on each reference projection feature-point coordinate, i.e. a neighborhood 1.5 times the subregion size, recording the correlation coefficients and the feature-point coordinates to obtain coarse match points;
Step 314, after the 96 × 96 search is complete, compare the correlation coefficients of all recorded coarse match points, select the largest correlation coefficient, and apply the threshold T_NCC;
Step 315, if the largest correlation coefficient exceeds the given threshold T_NCC, the corresponding coordinate point is taken as the fine match between the intermediate-image feature point and the reference projection feature point;
Step 316, fit the fine match pairs by least squares;
Step 317, obtain the transformation matrix between the intermediate image and the reference projection image, yielding the final transformation parameters of the projection matching, and complete the projection-image matching.
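The NCC verification of steps 312-315 can be sketched as follows: a feature patch from the reference projection is slid over the search window in the intermediate image, and the position with the highest normalised cross-correlation above the threshold T_NCC is accepted as the fine match. The window sizes follow the description (a patch from a subregion, searched within a 96 × 96 window); the threshold value 0.8 is an illustrative assumption.

```python
# NCC patch search: find the best match of `patch` inside `search`.
import numpy as np

def ncc(a, b):
    """Normalised cross-correlation of two equal-size patches."""
    a = a - a.mean()
    b = b - b.mean()
    d = np.sqrt((a * a).sum() * (b * b).sum())
    return float((a * b).sum() / d) if d > 0 else 0.0

def best_match(patch, search, t_ncc=0.8):
    """Slide `patch` over `search`; return (score, row, col) of the best
    position, or None if the best score does not exceed t_ncc."""
    ph, pw = patch.shape
    best = (-1.0, 0, 0)
    for r in range(search.shape[0] - ph + 1):
        for c in range(search.shape[1] - pw + 1):
            s = ncc(patch, search[r:r + ph, c:c + pw])
            if s > best[0]:
                best = (s, r, c)
    return best if best[0] > t_ncc else None

rng = np.random.default_rng(0)
search = rng.random((96, 96))            # 96 x 96 search window
patch = search[30:62, 40:72].copy()      # 32 x 32 patch hidden inside it
hit = best_match(patch, search)
```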
As shown in Fig. 5, step 4 comprises the following steps:
Step 41, obtain the transformation parameters produced on completion of the projection matching;
Step 42, return to the three-dimensional terrain and transform it;
Step 43, the three-dimensional terrain matching is complete.
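Applying the estimated projective transform back to the terrain (step 42) amounts to mapping the planar (x, y) coordinates of the terrain grid through the 3 × 3 matrix H while elevations ride along unchanged. The matrix below is an arbitrary illustrative translation, not one estimated from real data:

```python
# Apply a 3x3 homography to planar terrain coordinates.
import numpy as np

def apply_h(H, pts):
    """Map an (N, 2) array of points through a 3x3 homography."""
    pts = np.asarray(pts, dtype=np.float64)
    hom = np.hstack([pts, np.ones((pts.shape[0], 1))])  # homogeneous coords
    out = hom @ np.asarray(H, dtype=np.float64).T
    return out[:, :2] / out[:, 2:3]                     # dehomogenise

H = [[1.0, 0.0, 5.0],     # toy example: pure translation by (5, -3)
     [0.0, 1.0, -3.0],
     [0.0, 0.0, 1.0]]
moved = apply_h(H, [[0.0, 0.0], [10.0, 10.0]])
```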
As shown in Fig. 6, Fig. 6(a) is a visible-light projection image of the terrain and Fig. 6(b) an infrared projection image of the same terrain. Fig. 6(c) is the result of matching with the SURF algorithm, with a matching precision of 0.2108 pixel; Fig. 6(d) is the result of matching with the improved SURF algorithm of this invention, with a matching precision of 0.0338 pixel. The results show that the improved SURF algorithm raises the precision of the matching process.
Steps not described in detail involve conventional means and algorithms well known in the art and are not elaborated here.

Claims (4)

1. A fast and accurate three-dimensional terrain matching method based on heterogeneous projection, characterized by comprising at least the following steps:
Step 1, convert the original digital elevation model terrain data format USGS-DEM to the digital elevation model terrain data format CNSTDF-DEM;
Step 2, orthographically project the reference three-dimensional terrain and the three-dimensional terrain to be matched, both in the digital elevation model terrain data format CNSTDF-DEM of step 1;
Step 3, match the projection images obtained from the orthographic projection of step 2 to obtain the projection transformation parameters, using a method that combines line features with point features;
Step 4, match the reference three-dimensional terrain and the three-dimensional terrain to be matched according to the transformation parameters of step 3, and obtain the matching result;
wherein step 3 comprises the following steps:
Step 31, detect and match straight lines in the projection image to be matched and the reference projection image; after the corresponding lines have been matched, take the pairwise intersections of the matched lines in the projection image to be matched as the virtual corners of the projection image to be matched;
Step 32, take the pairwise intersections of the matched lines in the reference projection image as the virtual corners of the reference projection image;
Step 33, coarsely match the virtual corners with the SURF algorithm;
Step 34, finely match the virtual corners with the SURF algorithm;
Step 35, from the match combining the coarse match of step 33 and the fine match of step 34, obtain the set φ_AB of accurate feature-point pairs between the reference image and the image to be registered; from φ_AB the perspective transformation matrix H between the reference image and the image to be registered can be computed;
Step 36, transform the projection image to be matched according to the obtained transformation matrix H;
Step 37, interpolate the projection image to be matched;
Step 38, obtain the intermediate image after transformation and interpolation;
Step 39, obtain the reference projection image and the projection image to be matched;
Step 310, find the overlapping part of the reference projection image and the projection image to be matched as the respective regions of interest, and divide the reference image into multiple subregions according to the size of the overlap; when the overlap is large, the subregion size is 64 × 64; when the overlap is small, the subregions shrink accordingly;
Step 311, in each subregion of the reference projection image, extract HARRIS feature points in the 32 × 32 neighborhood around the subregion center, i.e. a neighborhood 0.5 times the subregion size, and take the point whose R value is largest among all HARRIS feature points in the region, i.e. the point most distinguishable from its surroundings, as the feature point of the reference projection image; if the 32 × 32 neighborhood contains no HARRIS feature point, treat the subregion center as the feature point;
Step 312, after all reference projection feature points have been extracted, perform NCC matching;
Step 313, in the intermediate image, scan the 96 × 96 neighborhood centered on each reference projection feature-point coordinate, i.e. a neighborhood 1.5 times the subregion size, recording the correlation coefficients and the feature-point coordinates to obtain coarse match points;
Step 314, after the 96 × 96 search is complete, compare the correlation coefficients of all recorded coarse match points, select the largest correlation coefficient, and apply the threshold T_NCC;
Step 315, if the largest correlation coefficient exceeds the given threshold T_NCC, the corresponding coordinate point is taken as the fine match between the intermediate-image feature point and the reference projection feature point;
Step 316, fit the fine match pairs by least squares;
Step 317, obtain the transformation matrix between the intermediate image and the reference projection image, yielding the final transformation parameters of the projection matching, and complete the projection-image matching.
2. The fast and accurate three-dimensional terrain matching method based on heterogeneous projection according to claim 1, characterized in that step 1 comprises the following steps:
Step 11, open a terrain file;
Step 12, judge whether the opened file is in the USGS-DEM terrain data format;
Step 13, if not, go to step 11; if so, start reading the data header;
Step 14, extract the relevant terrain file header;
Step 15, store the file header;
Step 16, allocate the data space;
Step 17, read the data body;
Step 18, filter out the first 144 bytes of each row of the data body;
Step 19, store the relevant data information;
Step 110, save the stored file header and data body as a CNSTDF-DEM terrain format file;
Step 111, the terrain conversion is complete.
3. The fast and accurate three-dimensional terrain matching method based on heterogeneous projection according to claim 1, characterized in that step 2 comprises the following steps:
Step 21, open the obtained three-dimensional CNSTDF-DEM terrain data;
Step 22, obtain the texture features of the terrain surface;
Step 23, obtain the three-dimensional terrain texture;
Step 24, obtain the orthographic terrain projection image from the texture.
4. The fast and accurate three-dimensional terrain matching method based on heterogeneous projection according to claim 1, characterized in that step 4 comprises the following steps:
Step 41, obtain the transformation parameters produced on completion of the projection matching;
Step 42, return to the three-dimensional terrain and transform it;
Step 43, the three-dimensional terrain matching is complete.
CN201510397177.4A 2015-07-08 2015-07-08 Fast and accurate three-dimensional terrain matching method based on heterogeneous projection Expired - Fee Related CN105005993B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201510397177.4A CN105005993B (en) 2015-07-08 2015-07-08 Fast and accurate three-dimensional terrain matching method based on heterogeneous projection

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201510397177.4A CN105005993B (en) 2015-07-08 2015-07-08 Fast and accurate three-dimensional terrain matching method based on heterogeneous projection

Publications (2)

Publication Number Publication Date
CN105005993A CN105005993A (en) 2015-10-28
CN105005993B true CN105005993B (en) 2018-02-16

Family

ID=54378650

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201510397177.4A Expired - Fee Related CN105005993B (en) 2015-07-08 2015-07-08 Fast and accurate three-dimensional terrain matching method based on heterogeneous projection

Country Status (1)

Country Link
CN (1) CN105005993B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102016221680B4 (en) * 2016-11-04 2022-06-15 Audi Ag Method for operating a semi-autonomous or autonomous motor vehicle and motor vehicle
CN110728713B (en) * 2018-07-16 2022-09-30 Oppo广东移动通信有限公司 Test method and test system
CN117132796B * 2023-09-09 2024-10-01 廊坊市珍圭谷科技有限公司 Efficient position matching method based on heterogeneous projection

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104618045A (en) * 2015-01-27 2015-05-13 北京交通大学 Collected data-based wireless channel transmission model establishing method and system

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4970296B2 (en) * 2008-01-21 2012-07-04 株式会社パスコ Orthophoto image generation method and photographing apparatus

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104618045A (en) * 2015-01-27 2015-05-13 北京交通大学 Collected data-based wireless channel transmission model establishing method and system

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Preliminary study on the automated interconversion of three common DEM formats; Li Shanshan et al.; Geomatics & Spatial Information Technology; 2008-06-30; Vol. 31, No. 3; pp. 6-8, 11 *
Heterologous image registration by high-frequency information vector matching; Han Guangliang et al.; Chinese Optics; 2011-10-31; Vol. 4, No. 5; pp. 468-473 *

Also Published As

Publication number Publication date
CN105005993A (en) 2015-10-28

Similar Documents

Publication Publication Date Title
CN112270249B (en) Target pose estimation method integrating RGB-D visual characteristics
CN107145874B (en) Ship target detection and identification method in complex background SAR image
US10325152B1 (en) Method of extracting warehouse in port from hierarchically screened remote sensing image
CN109544612B (en) Point cloud registration method based on feature point geometric surface description
CN104318548B (en) Rapid image registration implementation method based on space sparsity and SIFT feature extraction
CN101901343B (en) Remote sensing image road extracting method based on stereo constraint
CN103426186B An improved SURF fast matching method
CN104200461B Remote-sensing image registration method based on mutual-information image block selection and SIFT features
CN113034600B (en) Template matching-based texture-free planar structure industrial part identification and 6D pose estimation method
CN106529591A (en) Improved MSER image matching algorithm
CN102800074B (en) Synthetic aperture radar (SAR) image change detection difference chart generation method based on contourlet transform
CN106203342A (en) Target identification method based on multi-angle local feature coupling
CN104933434A Image matching method combining LBP (local binary pattern) and SURF feature extraction
CN104850850A (en) Binocular stereoscopic vision image feature extraction method combining shape and color
CN106919944A A fast wide-angle image recognition method based on the ORB algorithm
CN104820991A (en) Multi-soft-constraint stereo matching method based on cost matrix
CN102147867B (en) Method for identifying traditional Chinese painting images and calligraphy images based on subject
CN108491786A A face detection method based on hierarchical networks and cluster merging
CN105005993B (en) Fast and accurate three-dimensional terrain matching method based on heterogeneous projection
CN106709499A (en) SIFT image feature point extraction method based on Canny operator and Hilbert-Huang transform
CN110544300B (en) Method for automatically generating three-dimensional model based on two-dimensional hand-drawn image characteristics
CN113409332B (en) Building plane segmentation method based on three-dimensional point cloud
CN102903111B Stereo matching algorithm for large low-texture areas based on image segmentation
CN105160686A Low-altitude multi-view remote-sensing image matching method based on an improved scale-invariant feature transform (SIFT) operator
CN111105390B (en) Improved sea-sky-line detection and evaluation method

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20180216

Termination date: 20180708