CN112819959A - Hyperspectral image and laser radar data intrinsic hyperspectral point cloud generation method - Google Patents
Hyperspectral image and laser radar data intrinsic hyperspectral point cloud generation method
- Publication number
- CN112819959A (application CN202110086571.1A)
- Authority
- CN
- China
- Prior art keywords
- point cloud
- hyperspectral
- matrix
- hyper
- laser radar
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T17/00—Three dimensional [3D] modelling, e.g. data description of 3D objects
- G06T17/20—Finite element generation, e.g. wire-frame surface description, tesselation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T5/00—Image enhancement or restoration
- G06T5/50—Image enhancement or restoration using two or more images, e.g. averaging or subtraction
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/10—Segmentation; Edge detection
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10028—Range image; Depth image; 3D point clouds
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10032—Satellite or aerial image; Remote sensing
- G06T2207/10036—Multispectral image; Hyperspectral image
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10032—Satellite or aerial image; Remote sensing
- G06T2207/10044—Radar image
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20212—Image combination
- G06T2207/20221—Image fusion; Image merging
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Computer Graphics (AREA)
- Geometry (AREA)
- Software Systems (AREA)
- Optical Radar Systems And Details Thereof (AREA)
Abstract
The invention relates to a method for generating an intrinsic hyperspectral point cloud from a hyperspectral image and lidar data, belongs to the technical field of remote sensing image processing, and aims to solve the problems of information loss, spectral distortion, and low efficiency in existing hyperspectral point cloud generation methods.
Description
Technical Field
The invention relates to the technical field of remote sensing image processing, and in particular to a supervoxel-based method for generating an intrinsic hyperspectral point cloud from a hyperspectral image and lidar data.
Background
In recent years, hyperspectral and lidar sensors have become progressively smaller and easier to integrate, while the payload capacity and stability of the UAV platforms that carry them have improved greatly. As a result, combined hyperspectral-lidar systems with synchronized data acquisition have become one of the mainstream remote sensing schemes. With large volumes of hyperspectral and lidar data accumulating, how to integrate the two data types has become an urgent problem. Hyperspectral images and lidar point clouds are complementary and heterogeneous: a hyperspectral image carries rich spectral information, but its spatial information is degraded from three dimensions to two; a lidar point cloud provides accurate three-dimensional spatial information, but its spectral information is relatively limited.
Existing methods for fusing hyperspectral images with lidar point clouds fall roughly into two categories: one generates two-dimensional DSM or DEM data from the lidar point cloud and then registers it with the hyperspectral image; the other assigns the spectral information of the hyperspectral image to the lidar point cloud. Both approaches operate on features extracted from the hyperspectral image and the lidar point cloud, so either spectral information or three-dimensional spatial information is lost; moreover, the spectral information is easily affected by ambient illumination, so neither approach produces satisfactory integrated data.
Disclosure of Invention
The purpose of the invention is to provide a method for generating an intrinsic hyperspectral point cloud from a hyperspectral image and lidar data, addressing the information loss, spectral distortion, and low efficiency of existing hyperspectral point cloud generation methods.
The technical solution adopted by the invention to solve the above problem is as follows:

The method for generating an intrinsic hyperspectral point cloud from a hyperspectral image and lidar data comprises the following steps:

Step one: acquiring a hyperspectral image and a lidar point cloud, and obtaining an intrinsic mapping matrix from the hyperspectral image and the lidar point cloud;

Step two: performing supervoxel segmentation on the lidar point cloud to generate a supervoxel representation matrix;

Step three: obtaining the incident illumination direction from the intrinsic mapping matrix;

Step four: performing supervoxel-based joint intrinsic decomposition according to the intrinsic mapping matrix, the supervoxel representation matrix, and the incident illumination direction to generate the intrinsic hyperspectral point cloud.
Further, the specific steps of step one are as follows:

Input a hyperspectral image H = {h_1, h_2, …, h_u} and a lidar point cloud P = {p_1, p_2, …, p_v}, where h_k = [h_k(λ_1), h_k(λ_2), …, h_k(λ_d)]^T, k = 1, 2, …, u, is the spectral feature of each pixel, λ is the wavelength, d is the number of bands, and u is the total number of hyperspectral image pixels; the attribute feature of each point is p_s = [x_s, y_s, z_s, I_s]^T, s = 1, 2, …, v, where (x_s, y_s, z_s) is the coordinate of the point, I_s is the lidar intensity, and v is the total number of lidar points.

Calculate the normal of each point in the lidar point cloud P to obtain the normal features N = {n_1, n_2, …, n_v}, where n_s = [N_x, N_y, N_z]^T, s = 1, 2, …, v, is the projection of the normal on the x, y, z spatial coordinate axes; then calculate each element A_ij of the mapping matrix A.

Further, each element of the intrinsic mapping matrix is calculated from A_ij and N.
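The formulas for A_ij and the normal computation appear in the patent only as figures. As an illustrative sketch (not the patent's exact method), the per-point normals n_s = [N_x, N_y, N_z]^T can be estimated in the standard way by PCA over each point's nearest neighbours; the helper name and toy cloud below are hypothetical:

```python
import numpy as np

def point_normals(P, k=10):
    """Estimate per-point normals by PCA over the k nearest neighbours.

    P: (v, 3) array of point coordinates. Returns a (v, 3) array whose
    row s is the eigenvector of the local covariance with the smallest
    eigenvalue, i.e. the estimated surface normal n_s = [Nx, Ny, Nz]."""
    v = P.shape[0]
    normals = np.empty((v, 3))
    # pairwise squared distances (fine for small v; use a KD-tree for large clouds)
    d2 = ((P[:, None, :] - P[None, :, :]) ** 2).sum(-1)
    for s in range(v):
        nbrs = P[np.argsort(d2[s])[:k]]
        cov = np.cov(nbrs.T)
        w, V = np.linalg.eigh(cov)   # eigenvalues in ascending order
        normals[s] = V[:, 0]         # smallest-eigenvalue direction
    return normals

# toy cloud: noisy samples of the plane z = 0, whose true normal is (0, 0, ±1)
rng = np.random.default_rng(0)
pts = np.column_stack([rng.uniform(0, 1, 50), rng.uniform(0, 1, 50),
                       rng.normal(0, 1e-3, 50)])
N = point_normals(pts)
print(np.abs(N[:, 2]).min())  # close to 1: normals point along z
```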
Further, the specific steps of step two are as follows:

Map the lidar point cloud P to a weighted graph G = (V, E), where each vertex V_i in V represents a point P_i in the lidar point cloud and each vertex is connected by an edge to its 50 nearest vertices in Euclidean space; the weight of each edge e ∈ E is W(e) = |I_a − I_b|, where a and b are the indices of the vertices V_a and V_b joined by edge e. Initialize the supervoxel segmentation of the point cloud so that each supervoxel corresponds to a single vertex. Traverse the edges in E in order of increasing weight and check whether the two vertices joined by e belong to two different supervoxels; if not, continue to the next edge. If so, check whether W(e) is smaller than the distance between the two supervoxels joined by e; if it is, merge the two supervoxels, otherwise continue to the next edge. The supervoxel set obtained after the traversal is S = {S_1, S_2, …, S_t}, where t is the number of supervoxels, from which each element B_ij of the supervoxel representation matrix is calculated.

The supervoxel representation matrix B is then obtained from the matrix definition and B_ij.
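A minimal sketch of the graph-based merging described above. The kNN graph, intensity-difference edge weights, and ascending-weight traversal follow the text; the merge test ("W(e) smaller than the distance between the two supervoxels") is implemented here Felzenszwalb-style with a per-segment threshold, which is an assumption since the patent does not reproduce the exact formula:

```python
import numpy as np

def supervoxel_segmentation(xyz, intensity, k=5, tau=0.5):
    """Graph-based merging in the spirit of step two. Vertices are points,
    edges join each point to its k nearest neighbours, and edge weight is
    the intensity gap |I_a - I_b|. The per-segment merge threshold
    (tau / component size) is assumed, not taken from the patent."""
    v = len(xyz)
    parent = list(range(v))
    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]  # path halving
            i = parent[i]
        return i
    size = [1] * v
    thresh = [tau] * v                     # per-component merge threshold
    # build kNN edges (brute force; a KD-tree scales better)
    d2 = ((xyz[:, None, :] - xyz[None, :, :]) ** 2).sum(-1)
    edges = []
    for i in range(v):
        for j in np.argsort(d2[i])[1:k + 1]:
            edges.append((abs(intensity[i] - intensity[j]), i, int(j)))
    for w, i, j in sorted(edges):          # traverse by increasing weight
        a, b = find(i), find(j)
        if a != b and w < min(thresh[a], thresh[b]):
            parent[b] = a                  # merge the two supervoxels
            size[a] += size[b]
            thresh[a] = w + tau / size[a]
    return np.array([find(i) for i in range(v)])

# two spatially separated clusters with distinct intensities stay separate
xyz = np.vstack([np.random.rand(20, 3), np.random.rand(20, 3) + 5.0])
inten = np.concatenate([np.zeros(20), np.ones(20) * 10.0])
lab = supervoxel_segmentation(xyz, inten)
print(sorted(set(lab.tolist())))
```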
Further, the specific steps of step three are as follows:

First, establish a relational expression for each band λ, which together yield the linear system

GL = 0

Then perform an eigendecomposition of the matrix G^T G and arrange the resulting eigenvalues and their corresponding eigenvectors in ascending order of eigenvalue: {e_1, v_1}, {e_2, v_2}, {e_3, v_3}. The eigenvector v_1 corresponding to the smallest eigenvalue e_1 is the incident illumination direction L.
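The illumination recovery above is a null-space computation; a minimal sketch follows. How the rows of G are assembled appears in the patent only as an image, so G is synthesized directly here from a known direction:

```python
import numpy as np

def illumination_direction(Gm):
    """Solve G L = 0 in the least-squares sense: L is the eigenvector of
    G^T G with the smallest eigenvalue (equivalently, the right singular
    vector of G for its smallest singular value)."""
    w, V = np.linalg.eigh(Gm.T @ Gm)   # eigh returns ascending eigenvalues
    return V[:, 0]

# synthetic check: rows of G are constructed orthogonal to a known unit L
rng = np.random.default_rng(0)
L_true = np.array([1.0, 2.0, 2.0]) / 3.0       # unit vector
B = rng.normal(size=(100, 3))
G = B - np.outer(B @ L_true, L_true)           # project L_true out of each row
L_est = illumination_direction(G)
print(abs(L_est @ L_true))                     # ≈ 1 up to sign
```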
Further, the specific steps of step four are as follows:

First, calculate the element M_ij in row i, column j of the joint hyperspectral image-lidar point cloud intrinsic decomposition matrix M.

Then, from the matrix definition and M_ij, obtain the joint intrinsic decomposition matrix M, where B_jg, g = 1, 2, …, t, denotes the element in row j, column g of the supervoxel representation matrix B.

Let R = {R_1, R_2, …, R_t} be the set of intrinsic hyperspectral reflectances of the supervoxels; the intrinsic hyperspectral reflectance of each supervoxel satisfies the joint intrinsic decomposition relation.

Solving this relation yields the supervoxel reflectances, from which the intrinsic hyperspectral reflectance of all points is obtained, where r_s = [r_s(λ_1), r_s(λ_2), …, r_s(λ_d)]^T is the intrinsic hyperspectral reflectance of each point. Finally, the intrinsic hyperspectral reflectance r_s of each point is stacked onto the spatial coordinates [x_s, y_s, z_s]^T of the lidar point cloud P to generate the intrinsic hyperspectral point cloud.
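The joint decomposition relation itself appears in the patent only as an image. Assuming it takes the linear form M R = H (M built from the mapping and supervoxel matrices, H the observed spectra, R the per-supervoxel reflectances to recover), the solve step can be sketched as a least-squares problem:

```python
import numpy as np

def solve_supervoxel_reflectance(M, H):
    """Least-squares sketch of the joint intrinsic decomposition.
    Assumption (the patent's exact relation is given only as an image):
    the decomposition is linear, M @ R = H, with M (u x t), H (u x d),
    and R (t x d) the per-supervoxel intrinsic reflectances."""
    R, *_ = np.linalg.lstsq(M, H, rcond=None)
    return R

# synthetic check: recover known reflectances from a consistent system
rng = np.random.default_rng(0)
u, t, d = 40, 6, 8
M = rng.random((u, t))
R_true = rng.random((t, d))
R_est = solve_supervoxel_reflectance(M, M @ R_true)
print(np.allclose(R_est, R_true, atol=1e-6))  # True
```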
The invention has the following beneficial effects:

The method exploits the physical model of hyperspectral imaging to combine, at the signal level, the spectral information of the hyperspectral image with the three-dimensional spatial information of the lidar point cloud. It removes the spectral uncertainty of the hyperspectral image caused by illumination-induced degradation of the spectral information, and the object-oriented supervoxel scheme greatly accelerates the algorithm. The result is an intrinsic hyperspectral point cloud that integrates spectral and three-dimensional spatial information while largely avoiding information loss and spectral distortion.
Drawings
FIG. 1 is a flow chart of the present application;
FIG. 2 is an input hyperspectral image;
FIG. 3 is an input lidar point cloud;
FIG. 4 is a schematic diagram of the supervoxel segmentation results generated by the present invention;
FIG. 5 is a schematic diagram of an intrinsic hyperspectral point cloud generated by the present invention;
FIG. 6 is a schematic diagram of the error comparison between the hyperspectral point cloud generated by the present invention and other methods.
Detailed Description
It should be noted that the embodiments disclosed in the present application may be combined with each other in the absence of conflict.
Embodiment one: This embodiment is described with reference to FIG. 1. The method for generating an intrinsic hyperspectral point cloud from a hyperspectral image and lidar data according to this embodiment comprises the following steps:

Step one: acquiring a hyperspectral image and a lidar point cloud, and obtaining an intrinsic mapping matrix from the hyperspectral image and the lidar point cloud;

Step two: performing supervoxel segmentation on the lidar point cloud to generate a supervoxel representation matrix;

Step three: obtaining the incident illumination direction from the intrinsic mapping matrix;

Step four: performing supervoxel-based joint intrinsic decomposition according to the intrinsic mapping matrix, the supervoxel representation matrix, and the incident illumination direction to generate the intrinsic hyperspectral point cloud.
Step 1: Input a hyperspectral image H = {h_1, …, h_u} and a lidar point cloud P = {p_1, …, p_v}, where h_k = [h_k(λ_1), h_k(λ_2), …, h_k(λ_d)]^T, k = 1, 2, …, u, is the spectral feature of each pixel, λ is the wavelength, d is the number of bands, and u is the number of hyperspectral image pixels; p_k = [x_k, y_k, z_k, I_k]^T, k = 1, 2, …, v, where (x_k, y_k, z_k) is the coordinate of each point, I_k is the lidar intensity, and v is the number of lidar points. First calculate the normal of each point in the lidar point cloud P to obtain the normal features N = {n_1, …, n_v}, where n_k = [N_x, N_y, N_z]^T, k = 1, 2, …, v, is the projection of the normal on the x, y, z spatial coordinate axes. Then compute the hyperspectral-lidar mapping matrix A.

Further, the intrinsic mapping matrix can then be calculated from A and N.

Step 2: Map the lidar point cloud to a weighted graph G = (V, E), so that each vertex V_i in V represents a point P_i in the lidar point cloud and each vertex is connected by an edge to its 50 nearest vertices in Euclidean space; the weight of each edge e ∈ E is W(e) = |I_a − I_b|, where a and b are the indices of the two vertices joined by e. Initialize the supervoxel segmentation of the point cloud so that each supervoxel corresponds to a single vertex. Traverse the edges in E in order of increasing weight and check whether the two vertices joined by e belong to two different supervoxels; if not, continue to the next edge. If W(e) is smaller than the distance between those two supervoxels, merge the supervoxels joined by e; otherwise continue to the next edge. The supervoxel set obtained after the traversal is S = {S_1, …, S_t}, where t is the number of supervoxels, from which the supervoxel representation matrix B is computed.
and step 3: if two pixels H of the hyperspectral imageaAnd HbAnd (3) establishing a relational expression for each wave band lambda when the corresponding points on the laser radar point cloud all belong to the same voxel:
then, a series of equations for L obtained by the above formula are collated to obtain a system of equations for L:
GL=0
where each row of the matrix G corresponds to an equationIn (1)Part is oneThe row vector of (2). To pairMatrix GTG(GL=0,GT*G*L=G T0 ═ 0, L is the matrix GTG eigenvector), arranging the eigenvalue and eigenvector obtained by decomposition from the following to large according to the eigenvalue: { e1,v1},{e2,v2And { e } and3,v3}. The illumination direction L is equal to the minimum eigenvalue e1Corresponding feature vector v1。
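The per-band relation itself is shown in the patent only as an image. One consistent reconstruction, offered purely as an assumption, follows from a Lambertian shading model h = r (n · L): two pixels sharing the same intrinsic reflectance satisfy h_a (n_b · L) − h_b (n_a · L) = 0, giving one row of G per pixel pair and band:

```python
import numpy as np

def build_G(h_pairs, n_pairs):
    """Assemble the linear system G L = 0 from pixel pairs that share a
    supervoxel. The per-band relation used here is an ASSUMPTION (the
    patent gives it only as an image): with Lambertian shading
    h = r * (n . L), equal reflectance inside a supervoxel implies
    h_a * (n_b . L) - h_b * (n_a . L) = 0, i.e. one row
    (h_a * n_b - h_b * n_a)^T per pair and band."""
    rows = []
    for (ha, hb), (na, nb) in zip(h_pairs, n_pairs):
        for lam in range(len(ha)):            # one equation per band λ
            rows.append(ha[lam] * nb - hb[lam] * na)
    return np.array(rows)

# synthetic check: data generated with a known unit light direction
rng = np.random.default_rng(0)
L_true = np.array([0.0, 0.6, 0.8])
pairs_h, pairs_n = [], []
for _ in range(30):
    na, nb = rng.normal(size=3), rng.normal(size=3)
    r = rng.random(4)                          # shared reflectance, 4 bands
    pairs_h.append((r * (na @ L_true), r * (nb @ L_true)))
    pairs_n.append((na, nb))
G = build_G(pairs_h, pairs_n)
w, V = np.linalg.eigh(G.T @ G)
print(abs(V[:, 0] @ L_true))                   # ≈ 1 up to sign
```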
Step 4: Compute the joint hyperspectral image-lidar point cloud intrinsic decomposition matrix M so that the joint decomposition relation holds.

Let R = {R_1, …, R_t} be the intrinsic hyperspectral reflectances of the supervoxels; from this the joint intrinsic decomposition relation is obtained.

Solving it yields the supervoxel reflectances, from which the intrinsic hyperspectral reflectance of each point is computed, and the intrinsic hyperspectral point cloud is finally generated by stacking each point's reflectance onto its spatial coordinates.

Here A denotes a u × v matrix, and a_ij denotes the element of matrix A in row i, column j.
The experiment designed for the invention is as follows:

The data used in the experiment are the University of Houston data. FIG. 2 is the input hyperspectral image; FIG. 3 is the input lidar point cloud; FIG. 4 is the supervoxel segmentation result generated by the invention; FIG. 5 is the intrinsic hyperspectral point cloud generated by the invention; FIG. 6 is the error comparison between the hyperspectral point cloud generated by the invention and other methods. As FIG. 6 shows, the intrinsic hyperspectral point cloud generated by the invention has the smallest error.
It should be noted that the detailed description merely explains the technical solution of the invention and does not limit the scope of protection of the claims. All modifications and variations falling within the scope of the claims and the description are intended to be included within the scope of the invention.
Claims (5)
1. A method for generating an intrinsic hyperspectral point cloud from a hyperspectral image and lidar data, characterized by comprising the following steps:

Step one: acquiring a hyperspectral image and a lidar point cloud, and obtaining an intrinsic mapping matrix from the hyperspectral image and the lidar point cloud;

Step two: performing supervoxel segmentation on the lidar point cloud to generate a supervoxel representation matrix;

Step three: obtaining the incident illumination direction from the intrinsic mapping matrix;

Step four: performing supervoxel-based joint intrinsic decomposition according to the intrinsic mapping matrix, the supervoxel representation matrix, and the incident illumination direction to generate the intrinsic hyperspectral point cloud.
2. The method according to claim 1, characterized in that the specific steps of step one are:

inputting a hyperspectral image H = {h_1, h_2, …, h_u} and a lidar point cloud P = {p_1, p_2, …, p_v}, where h_k = [h_k(λ_1), h_k(λ_2), …, h_k(λ_d)]^T, k = 1, 2, …, u, is the spectral feature of each pixel, λ is the wavelength, d is the number of bands, and u is the total number of hyperspectral image pixels; the attribute feature of each point is p_s = [x_s, y_s, z_s, I_s]^T, s = 1, 2, …, v, where (x_s, y_s, z_s) is the coordinate of the point, I_s is the lidar intensity, and v is the total number of lidar points;

calculating the normal of each point in the lidar point cloud P to obtain the normal features N = {n_1, n_2, …, n_v}, where n_s = [N_x, N_y, N_z]^T, s = 1, 2, …, v, is the projection of the normal on the x, y, z spatial coordinate axes, and then calculating each element A_ij of the mapping matrix A;

further, calculating each element of the intrinsic mapping matrix from A_ij and N.
3. The method according to claim 2, characterized in that the specific steps of step two are:

mapping the lidar point cloud P to a weighted graph G = (V, E), where each vertex V_i in V represents a point P_i in the lidar point cloud and each vertex is connected by an edge to its 50 nearest vertices in Euclidean space, the weight of each edge e ∈ E being W(e) = |I_a − I_b|, where a and b are the indices of the vertices V_a and V_b joined by edge e; initializing the supervoxel segmentation of the point cloud so that each supervoxel corresponds to a single vertex; traversing the edges in E in order of increasing weight and judging whether the two vertices joined by e belong to two different supervoxels; if not, continuing to the next edge; if so, judging whether W(e) is smaller than the distance between the two supervoxels joined by e; if it is, merging the two supervoxels joined by e, otherwise continuing to the next edge; the supervoxel set obtained after the traversal being S = {S_1, S_2, …, S_t}, where t is the number of supervoxels, from which each element B_ij of the supervoxel representation matrix is calculated;

obtaining the supervoxel representation matrix B according to the matrix definition and B_ij.
4. The method according to claim 3, characterized in that the specific steps of step three are:

first establishing a relational expression for each band λ, which together yield the linear system

GL = 0

then performing an eigendecomposition of the matrix G^T G and arranging the resulting eigenvalues and their corresponding eigenvectors in ascending order of eigenvalue: {e_1, v_1}, {e_2, v_2}, {e_3, v_3}; the eigenvector v_1 corresponding to the smallest eigenvalue e_1 being the incident illumination direction L.
5. The method according to claim 4, characterized in that the specific steps of step four are:

first calculating the element M_ij in row i, column j of the joint hyperspectral image-lidar point cloud intrinsic decomposition matrix M;

then obtaining the joint intrinsic decomposition matrix M from the matrix definition and M_ij, where B_jg, g = 1, 2, …, t, denotes the element in row j, column g of the supervoxel representation matrix B;

letting R = {R_1, R_2, …, R_t} be the set of intrinsic hyperspectral reflectances of the supervoxels, the intrinsic hyperspectral reflectance of each supervoxel satisfying the joint intrinsic decomposition relation;

solving this relation to obtain the supervoxel reflectances and thereby the intrinsic hyperspectral reflectance of all points, where r_s = [r_s(λ_1), r_s(λ_2), …, r_s(λ_d)]^T is the intrinsic hyperspectral reflectance of each point; finally stacking the intrinsic hyperspectral reflectance r_s of each point onto the spatial coordinates [x_s, y_s, z_s]^T of the lidar point cloud P to generate the intrinsic hyperspectral point cloud.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110086571.1A CN112819959B (en) | 2021-01-22 | 2021-01-22 | Hyperspectral image and laser radar data intrinsic hyperspectral point cloud generation method |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110086571.1A CN112819959B (en) | 2021-01-22 | 2021-01-22 | Hyperspectral image and laser radar data intrinsic hyperspectral point cloud generation method |
Publications (2)
Publication Number | Publication Date |
---|---|
CN112819959A true CN112819959A (en) | 2021-05-18 |
CN112819959B CN112819959B (en) | 2022-03-04 |
Family
ID=75858945
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202110086571.1A Active CN112819959B (en) | 2021-01-22 | 2021-01-22 | Hyperspectral image and laser radar data intrinsic hyperspectral point cloud generation method |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN112819959B (en) |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113408635A (en) * | 2021-06-29 | 2021-09-17 | 哈尔滨工业大学 | Hyperspectral image eigen decomposition method based on assistance of digital surface model |
CN113936103A (en) * | 2021-12-14 | 2022-01-14 | 星际空间(天津)科技发展有限公司 | Method and equipment for constructing laser point cloud graph model |
CN115331110A (en) * | 2022-08-26 | 2022-11-11 | 苏州大学 | Fusion classification method and device for remote sensing hyperspectral image and laser radar image |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2018045626A1 (en) * | 2016-09-07 | 2018-03-15 | 深圳大学 | Super-pixel level information fusion-based hyperspectral image classification method and system |
CN108427913A (en) * | 2018-02-05 | 2018-08-21 | 中国地质大学(武汉) | The Hyperspectral Image Classification method of combined spectral, space and hierarchy information |
CN109087341A (en) * | 2018-06-07 | 2018-12-25 | 华南农业大学 | A kind of fusion method of short distance EO-1 hyperion camera and distance measuring sensor |
Non-Patent Citations (2)
Title |
---|
YANFENG GU等: "UAV-based integrated multispectral-LiDAR imaging system and data processing", 《SCIENCE CHINA TECHNOLOGICAL SCIENCES》 * |
王青旺: "多高光谱图像和LiDAR数据联合分类方法研究", 《中国博士学位论文全文数据库 工程科技II辑》 * |
Also Published As
Publication number | Publication date |
---|---|
CN112819959B (en) | 2022-03-04 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN112819959B (en) | Hyperspectral image and laser radar data intrinsic hyperspectral point cloud generation method | |
CN110009674B (en) | Monocular image depth of field real-time calculation method based on unsupervised depth learning | |
CN110853075B (en) | Visual tracking positioning method based on dense point cloud and synthetic view | |
Yurtseven et al. | Determination and accuracy analysis of individual tree crown parameters using UAV based imagery and OBIA techniques | |
CN107204037B (en) | Three-dimensional image generation method based on active and passive three-dimensional imaging system | |
CN112686935B (en) | Airborne sounding radar and multispectral satellite image registration method based on feature fusion | |
CN110807828B (en) | Oblique photography three-dimensional reconstruction matching method | |
CN112130169B (en) | Point cloud level fusion method for laser radar data and hyperspectral image | |
CN103822616A (en) | Remote-sensing image matching method with combination of characteristic segmentation with topographic inequality constraint | |
Youssefi et al. | Cars: A photogrammetry pipeline using dask graphs to construct a global 3d model | |
CN113077552A (en) | DSM (digital communication system) generation method and device based on unmanned aerial vehicle image | |
Moghaddam et al. | A statistical variable selection solution for RFM ill-posedness and overparameterization problems | |
Bybee et al. | Method for 3-D scene reconstruction using fused LiDAR and imagery from a texel camera | |
Parmehr et al. | Automatic parameter selection for intensity-based registration of imagery to LiDAR data | |
CN113256696B (en) | External parameter calibration method of laser radar and camera based on natural scene | |
Hong et al. | Liv-gaussmap: Lidar-inertial-visual fusion for real-time 3d radiance field map rendering | |
CN112785693B (en) | Method, system and device for generating intrinsic hyperspectral point cloud | |
CN116994029A (en) | Fusion classification method and system for multi-source data | |
CN117197333A (en) | Space target reconstruction and pose estimation method and system based on multi-view vision | |
CN116152800A (en) | 3D dynamic multi-target detection method, system and storage medium based on cross-view feature fusion | |
CN116682105A (en) | Millimeter wave radar and visual feature attention fusion target detection method | |
CN114359660B (en) | Multi-modal target detection method and system suitable for modal intensity change | |
CN114565653A (en) | Heterogeneous remote sensing image matching method with rotation change and scale difference | |
Gonçalves | Using structure-from-motion workflows for 3D mapping and remote sensing | |
Kern et al. | An accurate real-time Uav mapping solution for the generation of Orthomosaics and surface models |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||