CN104732553B - Feature point extraction method based on multiple laser-assisted targets


Info

Publication number
CN104732553B
CN104732553B (application number CN201510169317.2A)
Authority
CN
China
Prior art keywords
laser
light stripe
point
target
feature point
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201510169317.2A
Other languages
Chinese (zh)
Other versions
CN104732553A (en)
Inventor
贾振元
刘巍
徐鹏涛
王灵丽
刘阳
杨景豪
王亮
樊超楠
Current Assignee
Dalian University of Technology
Original Assignee
Dalian University of Technology
Priority date
Filing date
Publication date
Application filed by Dalian University of Technology
Priority to CN201510169317.2A
Publication of CN104732553A
Application granted
Publication of CN104732553B
Legal status: Active
Anticipated expiration

Landscapes

  • Image Analysis (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The present invention, a feature point extraction method based on multiple laser-assisted targets, belongs to the fields of image processing and computer vision measurement, and in particular relates to the processing of image data in laser-assisted multi-view data stitching; specifically, it concerns the design of a target based on multiple assisting lasers and a method for extracting its feature points. The method constructs a stitching target from the laser stripes projected by a non-contact, multi-station laser array; each intersection of the laser grid on the target serves as a stitching feature point, and the target coordinates are extracted with a high-precision feature point extraction algorithm. Compared with adhesive targets, the laser-assisted target adapts better to the measurement scene, and by first determining a region of interest and then extracting the feature point precisely, the method effectively improves both the accuracy and the efficiency of feature point extraction.

Description

Feature point extraction method based on multiple laser-assisted targets
Technical field
The invention belongs to the fields of image processing and computer vision measurement, and in particular relates to the processing of image data in laser-assisted multi-view data stitching; specifically, it concerns the design of a target based on multiple assisting lasers and a method for extracting its feature points.
Background art
In multi-view data stitching, the design of the target and the accurate extraction of its feature points strongly influence stitching accuracy. Conventional stitching targets fall roughly into three types: adhesive, placed, and projected. Adhesive and placed targets easily occlude the surface of the measured object, are inconvenient to rearrange on site, and adapt poorly to harsh environments, while conventional projected targets suffer from difficult, low-precision feature point extraction. The feature points of the target serve as the final stitching feature points. Traditional feature points usually rely on a specific shape distribution; taking the common circular spot as an example, it has no specific gray-level distribution, and the calibration position is computed only by fitting the feature edge, so error is unavoidable. Improving feature localization accuracy merely by raising hardware resolution is prohibitively expensive, so a high-precision feature point extraction algorithm is needed to determine the feature point coordinates.
In July 2006, Sun Junhua et al. published "Vision measurement data stitching method based on a planar baseline target" in the Chinese Journal of Mechanical Engineering, proposing a planar black-and-white checkerboard target whose grid corners serve as stitching feature points. The target achieves high accuracy, but when the measured object is partially occluded the grid corners are often mismatched because the occlusion pattern is uncertain; moreover, the target cannot be adjusted flexibly to measurement needs and adapts poorly to harsh industrial conditions such as high temperature. In April 2009, Liu Xiaoli et al. published "Global registration of depth data by means of marker points" in Acta Optica Sinica, proposing a stitching calibration method that pastes white circular markers in the public field of view: Gaussian filtering removes image noise, an edge detection operator coarsely locates the edge, the ellipse is then located precisely, and finally least-squares fitting of the edge points yields a sub-pixel circle center. This method requires tedious pasting and removal of markers, suffers point-cloud loss at the marker positions, and easily damages the measured surface.
Summary of the invention
The technical problem to be solved by the invention is to overcome the defects of the prior art by inventing a target based on multiple assisting lasers: a line-laser array produces a laser grid pattern, the intersections of the horizontal and vertical stripes serve as stitching feature points, the stitching target is constructed by non-contact laser projection, and the target coordinates are extracted with a high-precision feature point extraction algorithm, so as to improve the accuracy and stability of target feature point extraction.

The technical scheme adopted by the invention is a feature point extraction method based on multiple laser-assisted targets, characterized in that the method constructs a stitching target from the laser stripes projected by a non-contact, multi-station laser array; each intersection of the laser grid on the target serves as a stitching feature point, and the target coordinates are extracted with a high-precision feature point extraction algorithm. First, the initial positions of the feature points are obtained by an erosion operation; a region of interest is then determined from each initial position; finally, within the identified region of interest, the coordinates of several points on the horizontal and vertical stripes are extracted at fixed intervals using the invented laser stripe model together with an iteratively reweighted least-squares technique, two fitted curves are obtained for the horizontal and vertical stripes, and their intersection is taken as the precise position of the feature point. The extraction method comprises the following steps:
Step 1: Laser-assisted target design
The invention uses multiple line lasers to form a laser array and projects laser stripes onto the surface of the measured object, constructing a laser grid as the laser-assisted target, after which the target image is captured. To ensure robust stripe extraction, a locally overexposed laser stripe image is obtained by adjusting the camera exposure parameters, and the stripe spacing can be adjusted flexibly by repositioning the lasers. Each grid intersection formed by the laser-assisted target serves as a stitching feature point.
Step 2: Laser stripe model
For the laser stripes projected in Step 1, a locally overexposed laser stripe model is established:

f(u) = Σ_{i=1}^{n} a_i·sin(b_i·u + c_i)    (1)

where u is the pixel coordinate along the stripe normal, n is the number of terms, and a_i, b_i, c_i are the coefficients of each term. Differentiating the model with respect to u gives:

f′(u) = df/du = Σ_{i=1}^{n} a_i·b_i·cos(b_i·u + c_i)
f″(u) = d²f/du² = −Σ_{i=1}^{n} a_i·b_i²·sin(b_i·u + c_i)    (2)

According to the gray-level distribution across the stripe cross-section, the local extremum of the saturated region of the stripe is defined as the stripe center point.
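The center-point rule above can be sketched in code. The following is a minimal illustration only (the patent's own implementation uses MATLAB, and the saturation threshold here is an assumed value): given a 1-D gray profile across the stripe, values at or above the threshold form the overexposed region, and the middle local extremum of that region is returned as the center, falling back to the global peak when nothing saturates.

```python
import numpy as np

def stripe_center(profile, saturation=250):
    """Locate the stripe centre as the local extremum of the saturated
    (overexposed) region of a 1-D cross-section gray profile.
    `saturation` is an assumed overexposure gray level."""
    profile = np.asarray(profile, dtype=float)
    sat = np.flatnonzero(profile >= saturation)
    if sat.size == 0:                       # no overexposure: use the peak
        return int(np.argmax(profile))
    lo, hi = sat[0], sat[-1]
    seg = profile[lo:hi + 1]
    d = np.diff(seg)
    # interior points where the derivative changes sign = local extrema
    ext = np.flatnonzero(np.sign(d[:-1]) != np.sign(d[1:])) + 1
    if ext.size == 0:                       # flat plateau: take its midpoint
        return int((lo + hi) // 2)
    # with three extrema the middle one is the centre (cf. Fig. 2(d))
    return int(lo + ext[ext.size // 2])

# synthetic overexposed profile: a Gaussian bump clipped at the threshold
u = np.arange(64)
profile = np.clip(255.0 * np.exp(-(u - 30.0) ** 2 / 60.0), 0, 250)
center = stripe_center(profile)
```

For the symmetric clipped profile above the returned center is the plateau midpoint, pixel 30.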
Step 3: Determining the initial feature point positions and regions of interest
The target image obtained in Step 1 is preprocessed with a median filter, which removes impulse noise while preserving image detail to a certain extent. The image is then binarized and eroded; with a reasonably chosen erosion threshold, an isolated connected region is obtained for each grid intersection, and the gray-level centroid of each connected region is taken as the initial feature point position p0(i, j) = (u_ij, v_ij), where i and j denote the row and column of the feature point in the laser grid, i = 1, 2, …, 5, j = 1, 2, …, 5, and u_ij and v_ij are the u-axis and v-axis pixel coordinates of the corresponding initial position in the image.
The region of interest is a circular region of radius r_ROI pixels centered on the initial feature point position. Quickly obtaining the rough feature point positions in the laser grid image and marking regions of interest effectively reduces the search area and improves feature point extraction efficiency.
Step 4: Determining the precise feature point coordinates
Within the regions of interest identified in Step 3, the laser stripe center points are computed with the locally saturated laser stripe model proposed in Step 2. For a vertical laser stripe, v denotes the search position with a fixed pixel interval d; to match the stripe information, 10 center coordinates are extracted per stripe. Using the same v-axis value v_k within the region of interest, scanning yields the laser stripe data on a series of cross-sections; these raw data are matched against the saturated stripe model, and x_k is the u-axis coordinate of the resulting center point. Writing the stripe center point as (x_k, v_k), the set of all center points V = {V_k(x_k, v_k)} is obtained, where k = 1, 2, …, 10. The center point set of the horizontal stripe, U = {U_k(u_k, y_k)}, is obtained in the same way, where (u_k, y_k) is the center point at the k-th fixed u-axis value u_k, k = 1, 2, …, 10.
Curves are fitted to the vertical and horizontal stripe center point sets, and the intersection of the two fitted stripe curves is taken as the precise feature point coordinates. Using an iteratively reweighted least-squares technique, the precise feature point position p(i, j) is defined as the intersection of the two fitted laser lines, where i = 1, 2, …, 5, j = 1, 2, …, 5. In the same way, the precise position of every feature point on the projected target can be obtained.
The beneficial effect of the invention is that, compared with adhesive targets, the laser-assisted target adapts better to the measurement scene; by first determining a region of interest and then extracting the feature points precisely, the accuracy and efficiency of feature point extraction are effectively improved.
Brief description of the drawings
Fig. 1 shows the projection pattern of the multi-laser-assisted target.
Fig. 2 shows the gray-level distribution curves of four typical locally overexposed laser stripe models, where the abscissa u is the pixel coordinate, the ordinate is the corresponding stripe gray value, (a), (b), and (c) are gray-level distribution curves with a single peak, (d) is a gray-level distribution curve with three extrema, and point a marks the extracted center.
Fig. 3 illustrates feature point extraction in the region of interest, where p0(i, j) is the initial feature point position and the ROI is the region of interest determined from it.
Fig. 4 shows the precise feature point p(i, j) obtained in the region of interest by curve fitting.
Fig. 5 shows the feature point extraction result based on the laser-assisted target; the left part is an enlarged view of the extraction result for the feature point p(1, 1) in the first row and first column.
Detailed description of the embodiments
An embodiment of the present invention is described in detail below with reference to the technical scheme and the accompanying drawings. For the multi-laser-assisted target and feature point extraction method of the invention, the embodiment uses 10 line lasers to project a 5 × 5 laser grid onto the surface of the measured object, with the intersections of the horizontal and vertical stripes serving as stitching feature points; after the image is captured, preprocessing yields the initial feature point positions, regions of interest are then determined, and the precise feature point coordinates are finally obtained.
Step 1: Laser-assisted target design
Ten line lasers form the laser array and project laser stripes onto the surface of the measured object, constructing a 5 × 5 laser grid as the laser-assisted target; the 25 grid intersections formed serve as stitching feature points. The laser wavelength is 650 nm. An image of the laser-assisted target is captured, as shown in Fig. 1.
Step 2: Laser stripe model
For the laser stripes projected in Step 1, a locally overexposed laser stripe model is established. Fig. 2 shows the gray-level distribution curves of four typical locally overexposed laser stripe models, where the abscissa is the pixel position across the stripe and the ordinate is the corresponding gray value; (a), (b), and (c) are single-peak gray-level distribution curves, and (d) is a gray-level distribution curve with three extrema.
The laser stripe model is established using formula (1), where u is the pixel coordinate along the stripe normal, n is the number of terms, and a_i, b_i, c_i are the coefficients of each term. Because the stripe width in the image is limited, n is set to 2, so the fitting function has 6 parameters: a1, b1, c1, a2, b2, c2; the derivatives of the model are given by formula (2) above. The gray-level distribution of the laser stripe is obtained in MATLAB, and according to this distribution the local extremum of the saturated region is defined as the stripe center point. If there is only one peak, its position is the center point; if the saturated region has 3 extrema, the middle extremum, identified by the second-derivative feature, is taken as the center point.
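As a worked illustration of the two-term fit with n = 2, the sketch below fits formula (1) to a synthetic cross-section generated from the model itself. It is a sketch under stated assumptions: the patent performs the fitting in MATLAB, whereas this example uses SciPy's `curve_fit`, and the parameter values, noise level, and initial guess are all illustrative.

```python
import numpy as np
from scipy.optimize import curve_fit

def stripe_model(u, a1, b1, c1, a2, b2, c2):
    """Formula (1) with n = 2: f(u) = a1*sin(b1*u + c1) + a2*sin(b2*u + c2)."""
    return a1 * np.sin(b1 * u + c1) + a2 * np.sin(b2 * u + c2)

# synthetic cross-section generated from the model plus mild noise
# (all numeric values here are illustrative assumptions, not patent data)
u = np.linspace(0.0, 20.0, 200)
true_params = (200.0, 0.15, 0.3, 40.0, 0.45, 1.0)
gray = stripe_model(u, *true_params)
gray += np.random.default_rng(0).normal(0.0, 1.0, u.size)

p0 = (180.0, 0.2, 0.0, 30.0, 0.5, 0.8)   # rough initial guess for the 6 parameters
popt, _ = curve_fit(stripe_model, u, gray, p0=p0, maxfev=20000)
rmse = float(np.sqrt(np.mean((stripe_model(u, *popt) - gray) ** 2)))
```

A sum-of-sines model is sensitive to the starting point, so in practice the initial guess would come from the coarse stripe position found in Step 3; here the guess is simply chosen near the generating parameters.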
Step 3: Determining the initial feature point positions and regions of interest
The target image obtained in Step 1 is preprocessed with a median filter, which removes impulse noise while preserving image detail to a certain extent. The image is then binarized and eroded; with a reasonably chosen erosion threshold, 25 isolated connected regions are obtained, and the gray-level centroid of each connected region, computed in MATLAB, is taken as the initial feature point position p0(i, j), where i = 1, 2, …, 5, j = 1, 2, …, 5.
The region of interest is a circular region of radius r_ROI pixels centered on the initial feature point position; Fig. 3 shows the regions of interest determined from the initial positions. Based on the experimental scene image, r_ROI is set to 80 pixels. Quickly obtaining the rough feature point positions in the laser grid image and marking regions of interest effectively reduces the search area and improves feature point extraction efficiency.
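The coarse-localization pipeline of Step 3 can be sketched with SciPy's `ndimage` module (an assumption for this example; the patent computes the centroids in MATLAB, and the threshold and erosion count below are illustrative, scene-dependent choices). The toy image contains a single stripe crossing; repeated erosion removes the thin stripe arms and leaves an isolated blob at the intersection, whose gray-level centroid is the initial position p0.

```python
import numpy as np
from scipy import ndimage

def initial_feature_points(img, thresh=128, erode_iter=3):
    """Coarse feature-point localisation: median filter, binarise, erode
    until each grid intersection is an isolated blob, then return the
    gray-level centroid of every blob. Thresholds are assumptions."""
    den = ndimage.median_filter(img, size=3)           # impulse-noise removal
    binary = den > thresh                              # binarisation
    core = ndimage.binary_erosion(binary, iterations=erode_iter)
    labels, n = ndimage.label(core)                    # isolated regions
    # gray-level centroid of each connected region = initial position p0(i, j)
    return ndimage.center_of_mass(den, labels, range(1, n + 1))

# toy 41x41 image: one horizontal and one vertical stripe, width 5 px
img = np.zeros((41, 41))
img[18:23, :] = 200.0        # horizontal stripe
img[:, 18:23] = 200.0        # vertical stripe
points = initial_feature_points(img)
```

Three erosion passes fully consume the width-5 stripe arms but not the thicker material at the crossing, so exactly one region survives and its centroid sits at the intersection, pixel (20, 20).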
Step 4: Determining the precise feature point coordinates
Within the regions of interest identified in Step 3, the laser stripe center points are computed with the locally saturated laser stripe model proposed in Step 2. For a vertical laser stripe, v denotes the search position with a fixed pixel interval d, where d is set to 4 and 10 center coordinates are extracted per stripe. Using the same v-axis value v_k within the region of interest, scanning yields the laser stripe data on a series of cross-sections; these raw data are matched against the saturated stripe model, and x_k is the u-axis coordinate of the resulting center point. Writing the stripe center point as (x_k, v_k), the set of all center points V = {V_k(x_k, v_k)} is obtained. Likewise, the center point set of the horizontal stripe, U = {U_k(u_k, y_k)}, is obtained, where (u_k, y_k) is the center point at the k-th fixed u-axis value u_k, k = 1, 2, …, 10.
After the stripe center points are obtained, an iteratively reweighted least-squares technique is used to reject outliers. The precise feature point position p(i, j) is then defined as the intersection of the two fitted laser lines, where i = 1, 2, …, 5, j = 1, 2, …, 5. Fig. 5 shows the feature point extraction result based on the laser-assisted target; p(1, 1) is the extracted feature point in the first row and first column of the laser grid. In addition, if the projection surface is not a plane but a free-form surface, the two laser stripe curves can be fitted with polynomial equations and their intersection computed in the same way.
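The outlier-rejecting line fit and intersection of Step 4 can be sketched as follows. This is a minimal iteratively-reweighted-least-squares (IRLS) illustration: the inverse-residual weighting scheme, the synthetic center points, and the outlier magnitudes are assumptions for the example, not values from the patent. The near-vertical stripe is parameterized as u = m·v + q (and the horizontal one as v = m·u + q) so that both fits stay well conditioned.

```python
import numpy as np

def irls_line(t, s, iters=10, eps=1.0):
    """Fit s = m*t + q by iteratively reweighted least squares; weights
    w = 1/max(|residual|, eps) down-weight outlying centre points
    (this particular weighting is an assumption for the sketch)."""
    t = np.asarray(t, float); s = np.asarray(s, float)
    w = np.ones_like(t)
    A = np.vstack([t, np.ones_like(t)]).T
    m = q = 0.0
    for _ in range(iters):
        sw = np.sqrt(w)
        m, q = np.linalg.lstsq(A * sw[:, None], s * sw, rcond=None)[0]
        w = 1.0 / np.maximum(np.abs(s - (m * t + q)), eps)
    return m, q

# synthetic stripe-centre sets near one grid intersection, each with one
# gross outlier (numeric values are illustrative, not from the patent)
v = np.arange(25.0, 35.0)                 # sampled rows on the vertical stripe
x = 50.0 + 0.02 * v                       # true line u = 50 + 0.02*v
x[3] += 8.0                               # outlier
u = np.arange(46.0, 56.0)                 # sampled columns on the horizontal stripe
y = 30.0 + 0.01 * u                       # true line v = 30 + 0.01*u
y[5] -= 6.0                               # outlier

mV, qV = irls_line(v, x)                  # fitted u = mV*v + qV
mH, qH = irls_line(u, y)                  # fitted v = mH*u + qH
# intersection of the two fitted lines = precise feature point p(i, j)
u_star = (mV * qH + qV) / (1.0 - mV * mH)
v_star = mH * u_star + qH
```

Despite one gross outlier in each center-point set, the down-weighting keeps the recovered intersection close to the true crossing near (u, v) = (50.6, 30.5).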
The present invention designs a target based on multiple assisting lasers: ten line lasers form a laser array that projects a 5 × 5 laser grid, the 25 grid intersections serve as stitching feature points, and by first coarsely locating regions of interest and then precisely finding the feature points, efficient, high-accuracy feature point extraction is achieved.

Claims (1)

1. A feature point extraction method based on multiple laser-assisted targets, characterized in that the method constructs a stitching target from the laser stripes projected by a non-contact, multi-station laser array; each intersection of the laser grid on the target serves as a stitching feature point, and the target coordinates are extracted with a high-precision feature point extraction algorithm; first, the initial positions of the feature points are obtained by an erosion operation; a region of interest is then determined from each initial position; finally, within the identified region of interest, the coordinates of several points on the horizontal and vertical stripes are extracted at fixed intervals using the invented laser stripe model together with an iteratively reweighted least-squares technique, two fitted curves are obtained for the horizontal and vertical stripes, and their intersection is taken as the precise position of the feature point; the extraction method comprises the following steps:
Step 1: Laser-assisted target design
Multiple line lasers form a laser array and project laser stripes onto the surface of the measured object, constructing a laser grid as the laser-assisted target, after which the target image is captured. To ensure robust stripe extraction, a locally overexposed laser stripe image is obtained by adjusting the camera exposure parameters, and the stripe spacing can be adjusted flexibly by repositioning the lasers. Each grid intersection formed by the laser-assisted target serves as a stitching feature point.
Step 2: Laser stripe model
For the laser stripes projected in Step 1, a laser stripe model is established:

f(u) = Σ_{i=1}^{n} a_i·sin(b_i·u + c_i)    (1)

where u is the pixel coordinate along the stripe normal, n is the number of terms, and a_i, b_i, c_i are the coefficients of each term. Differentiating the model with respect to u gives:

f′(u) = df/du = Σ_{i=1}^{n} a_i·b_i·cos(b_i·u + c_i)
f″(u) = d²f/du² = −Σ_{i=1}^{n} a_i·b_i²·sin(b_i·u + c_i)    (2)

According to the gray-level distribution across the stripe cross-section, the local extremum of the saturated region of the stripe is defined as the stripe center point.
Step 3: Determining the initial feature point positions and regions of interest
The target image obtained in Step 1 is preprocessed with a median filter, which removes impulse noise while preserving image detail. The image is then binarized and eroded; with a reasonably chosen erosion threshold, an isolated connected region is obtained for each grid intersection, and the gray-level centroid of each connected region is taken as the initial feature point position p0(i, j) = (u_ij, v_ij), where i and j denote the row and column of the feature point in the laser grid, i = 1, 2, …, 5, j = 1, 2, …, 5, and u_ij and v_ij are the u-axis and v-axis pixel coordinates of the corresponding initial position in the image.
The region of interest is a circular region of radius r_ROI pixels centered on the initial feature point position. Quickly obtaining the rough feature point positions in the laser grid image and marking regions of interest effectively reduces the search area and improves feature point extraction efficiency.
Step 4: Determining the precise feature point coordinates
Within the regions of interest identified in Step 3, the laser stripe center points are computed with the laser stripe model proposed in Step 2. For a vertical laser stripe, v denotes the search position with a fixed pixel interval d; to match the stripe information, 10 center coordinates are extracted per stripe. Using the same v-axis value v_k within the region of interest, scanning yields the laser stripe data on a series of cross-sections; these raw data are matched against the laser stripe model, and the set of all center points V = {V_k(x_k, v_k)} is obtained, where (x_k, v_k) is the center point at the k-th fixed v-axis value v_k, k = 1, 2, …, 10. The center point set of the horizontal stripe, U = {U_k(u_k, y_k)}, is obtained in the same way, where (u_k, y_k) is the center point at the k-th fixed u-axis value u_k, k = 1, 2, …, 10.
Curves are fitted to the vertical and horizontal stripe center point sets, and the intersection of the two fitted stripe curves is taken as the precise feature point coordinates. Using an iteratively reweighted least-squares technique, the precise feature point position p(i, j) is defined as the intersection of the two fitted laser lines, where i = 1, 2, …, 5, j = 1, 2, …, 5. In the same way, the precise position of every feature point on the projected target can be obtained.
CN201510169317.2A 2015-04-10 2015-04-10 Feature point extraction method based on multiple laser-assisted targets Active CN104732553B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201510169317.2A CN104732553B (en) 2015-04-10 2015-04-10 Feature point extraction method based on multiple laser-assisted targets


Publications (2)

Publication Number Publication Date
CN104732553A CN104732553A (en) 2015-06-24
CN104732553B true CN104732553B (en) 2017-07-11

Family

ID=53456421

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201510169317.2A Active CN104732553B (en) 2015-04-10 2015-04-10 Feature point extraction method based on multiple laser-assisted targets

Country Status (1)

Country Link
CN (1) CN104732553B (en)

Families Citing this family (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106204593B (en) * 2016-07-12 2018-11-09 大连理工大学 A kind of Feature Points Extraction based on sequence image synthesis
CN106355613B (en) * 2016-08-31 2019-06-28 中国工程物理研究院激光聚变研究中心 The method for automatically extracting cross centre of figure based on least square fitting iteration
CN106485252B (en) * 2016-09-08 2019-06-21 上海卫星工程研究所 Dot matrix target image Feature point recognition method is tested in image registration
CN107392849B (en) * 2017-06-27 2021-02-05 中国航空工业集团公司北京长城航空测控技术研究所 Target identification and positioning method based on image subdivision
CN107563991B (en) * 2017-08-01 2019-08-20 大连理工大学 Piece surface is broken extraction and the matching process of laser striation
CN109737883A (en) * 2018-12-21 2019-05-10 成都蕴才汇智科技有限责任公司 A kind of three-dimensional deformation dynamic measurement system and measurement method based on image recognition
CN111442739B (en) * 2020-04-02 2022-03-01 上海海洋大学 Calibration system for underwater three-dimensional imaging of moving water tank and construction method
CN111854935B (en) * 2020-09-24 2021-01-22 歌尔股份有限公司 Projection system illumination measuring method, apparatus and readable storage medium
CN112669379B (en) * 2020-12-22 2024-01-02 大连工业大学 Image feature rapid extraction method based on auxiliary mark points
CN113096191B (en) * 2020-12-23 2022-08-16 合肥工业大学 Intelligent calibration method for monocular camera based on coding plane target
CN112907490B (en) * 2021-01-29 2023-01-20 合肥工业大学 Pixel point extraction method, target information acquisition method and push bench control system
CN112991316B (en) * 2021-03-30 2022-07-01 中国空气动力研究与发展中心超高速空气动力研究所 Dynamic measurement technology for model edge ablation amount
WO2024000295A1 (en) * 2022-06-29 2024-01-04 深圳华大生命科学研究院 Method and apparatus for determining coordinates of point in image, and corresponding device and medium

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1398721A2 (en) * 2002-09-13 2004-03-17 GE Medical Systems Global Technology Company LLC Computer assisted analysis of tomographic mammography data
CN103727927A (en) * 2013-12-19 2014-04-16 大连理工大学 High-velocity motion object pose vision measurement method based on structured light
CN103913131A (en) * 2014-04-14 2014-07-09 大连理工大学 Free curve method vector measurement method based on binocular vision
CN103940369A (en) * 2014-04-09 2014-07-23 大连理工大学 Quick morphology vision measuring method in multi-laser synergic scanning mode

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2013058174A (en) * 2011-09-09 2013-03-28 Fujitsu Ltd Image processing program, image processing method, and image processing device


Non-Patent Citations (5)

* Cited by examiner, † Cited by third party
Title
A robust method to extract a laser stripe centre based on grey level moment;Sun Qiucheng等;《Optics and Lasers in Engineering》;20141205;第67卷;第122-127页 *
An improved image acquiring method for machine vision measurement of hot formed parts;Jia Zhenyuan等;《Journal of Materials Processing Technology》;20101231;第210卷;第267-271页 *
Dimensional measurement of hot, large forgings with stereo vision structured light system;B.Wang等;《Proceedings of the Institution of Mechanical Engineers Part B Journal of Engineering Manufacture》;20110630;第225卷(第6期);第901-908页 *
Unbiased extraction of lines with parabolic and Gaussian profiles;Carsten Steger;《Computer Vision and Image Understanding》;20120908;第117卷;第97-112页 *
Method for extracting laser stripe center coordinates based on multi-scale analysis; Li Fengjiao et al.; Acta Optica Sinica; Nov. 2014; vol. 34, no. 11; pp. 1110002-1 to 1110002-6 *

Also Published As

Publication number Publication date
CN104732553A (en) 2015-06-24


Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant