CN112819959A - Hyperspectral image and laser radar data intrinsic hyperspectral point cloud generation method - Google Patents


Info

Publication number
CN112819959A
CN112819959A (application CN202110086571.1A; granted as CN112819959B)
Authority
CN
China
Prior art keywords
point cloud
hyperspectral
matrix
hyper
laser radar
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202110086571.1A
Other languages
Chinese (zh)
Other versions
CN112819959B (en)
Inventor
Gu Yanfeng (谷延锋)
Jin Xudong (金旭东)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Harbin Institute of Technology
Original Assignee
Harbin Institute of Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Harbin Institute of Technology
Priority to CN202110086571.1A
Publication of CN112819959A
Application granted
Publication of CN112819959B
Legal status: Active
Anticipated expiration


Classifications

    • G06T 17/00: Three dimensional [3D] modelling, e.g. data description of 3D objects
    • G06T 17/20: Finite element generation, e.g. wire-frame surface description, tessellation
    • G06T 5/00: Image enhancement or restoration
    • G06T 5/50: Image enhancement or restoration using two or more images, e.g. averaging or subtraction
    • G06T 7/00: Image analysis
    • G06T 7/10: Segmentation; Edge detection
    • G06T 2207/10028: Range image; Depth image; 3D point clouds
    • G06T 2207/10032: Satellite or aerial image; Remote sensing
    • G06T 2207/10036: Multispectral image; Hyperspectral image
    • G06T 2207/10044: Radar image
    • G06T 2207/20221: Image fusion; Image merging
    (All under G: PHYSICS; G06: COMPUTING; CALCULATING OR COUNTING; G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL; G06T 2207/00: Indexing scheme for image analysis or image enhancement.)

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Computer Graphics (AREA)
  • Geometry (AREA)
  • Software Systems (AREA)
  • Optical Radar Systems And Details Thereof (AREA)

Abstract

The invention provides a method for generating an intrinsic hyperspectral point cloud from a hyperspectral image and lidar data. It belongs to the technical field of remote sensing image processing and aims to solve the problems of information loss, spectral distortion, and low efficiency in existing hyperspectral point cloud generation methods.

Description

Hyperspectral image and laser radar data intrinsic hyperspectral point cloud generation method
Technical Field
The invention relates to the technical field of remote sensing image processing, in particular to a supervoxel-based method for generating an intrinsic hyperspectral point cloud from a hyperspectral image and lidar data.
Background
In recent years, hyperspectral and lidar sensors have become steadily smaller and easier to integrate, while the payload capacity and stability of the UAV platforms that carry them have improved greatly. As a result, combined hyperspectral-lidar systems with synchronized data acquisition have become one of the mainstream remote sensing schemes. With large volumes of hyperspectral and lidar data accumulating, how to integrate the two data types has become an urgent problem. Hyperspectral images and lidar point clouds are complementary and heterogeneous: a hyperspectral image carries rich spectral information, but its spatial information is degraded from three dimensions to two, whereas a lidar point cloud provides accurate three-dimensional spatial information but relatively little spectral information.
Existing methods for fusing hyperspectral images with lidar point clouds fall roughly into two categories: one generates two-dimensional DSM or DEM data from the lidar point cloud and then registers it with the hyperspectral image; the other assigns the spectral information of the hyperspectral image to the lidar point cloud. Both approaches operate on features extracted from the hyperspectral image and the lidar point cloud, so spectral or three-dimensional spatial information is lost; moreover, the spectral information is easily affected by ambient illumination, so neither produces satisfactory integrated data.
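The first fusion family mentioned above (rasterizing the lidar point cloud into a DSM, then registering it with the image) can be sketched as follows. The grid cell size and the keep-the-highest-z rule are conventional illustrative choices, not taken from the patent:

```python
import numpy as np

def point_cloud_to_dsm(points, cell=1.0):
    """Rasterize an (n, 3) point cloud into a digital surface model (DSM):
    keep the highest z value falling into each grid cell."""
    xy = points[:, :2]
    origin = xy.min(axis=0)
    ij = np.floor((xy - origin) / cell).astype(int)
    rows, cols = ij.max(axis=0) + 1
    dsm = np.full((rows, cols), -np.inf)     # cells with no points stay -inf
    for (i, j), z in zip(ij, points[:, 2]):
        dsm[i, j] = max(dsm[i, j], z)
    return dsm

# Three points, the first two of which share one grid cell
pts = np.array([[0.2, 0.3, 1.0], [0.7, 0.4, 3.0], [1.5, 0.1, 2.0]])
dsm = point_cloud_to_dsm(pts, cell=1.0)      # per cell, the max z wins
```

The resulting raster can then be co-registered with the hyperspectral image, which is exactly where the loss of full 3D structure that the patent criticizes occurs.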
Disclosure of Invention
The purpose of the invention is to address the information loss, spectral distortion, and low efficiency of existing hyperspectral point cloud generation methods by providing a method for generating an intrinsic hyperspectral point cloud from a hyperspectral image and lidar data.
The technical scheme adopted by the invention to solve the above technical problem is as follows:
The method for generating an intrinsic hyperspectral point cloud from a hyperspectral image and lidar data comprises the following steps:
step one: acquiring a hyperspectral image and a lidar point cloud, and obtaining an intrinsic mapping matrix from the hyperspectral image and the lidar point cloud;
step two: performing supervoxel segmentation of the lidar point cloud to generate a supervoxel representation matrix;
step three: obtaining the incident illumination direction from the intrinsic mapping matrix;
step four: performing supervoxel-based joint intrinsic decomposition using the intrinsic mapping matrix, the supervoxel representation matrix, and the incident illumination direction to generate the intrinsic hyperspectral point cloud.
Further, the specific steps of step one are:
inputting a hyperspectral image H = {h_k | k = 1, 2, …, u} and a lidar point cloud P = {p_s | s = 1, 2, …, v}, where h_k = [h_k(λ1), h_k(λ2), …, h_k(λd)]^T, k = 1, 2, …, u is the spectral feature of each pixel, λ is the wavelength, d is the number of bands, and u is the total number of hyperspectral image pixels; the attribute feature of each point is p_s = [x_s, y_s, z_s, I_s]^T, s = 1, 2, …, v, where (x_s, y_s, z_s) is the coordinate of the point, I_s is the lidar intensity, and v is the total number of lidar points;
calculating the normal of each point in the lidar point cloud P to obtain the normal features N = {n_s | s = 1, 2, …, v}, where n_s = [N_x, N_y, N_z]^T is the projection of the normal onto the x, y, z coordinate axes, and then calculating each element A_ij of the matrix A:
[equation image: definition of A_ij]
further calculating, from A_ij and N, each element of the intrinsic mapping matrix:
[equation image: definition of the intrinsic mapping matrix elements]
and, from the matrix definition and these elements, obtaining the intrinsic mapping matrix.
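The patent does not fix how the per-point normals n_s are computed. A common estimator, shown here as a minimal numpy sketch with brute-force neighbour search, fits a local plane to each point's k nearest neighbours and takes the direction of least variance as the normal:

```python
import numpy as np

def estimate_normals(points, k=10):
    """Per-point normal features n_s = [Nx, Ny, Nz]^T via PCA over the
    k nearest neighbours (brute-force search; fine for small clouds)."""
    dist = np.linalg.norm(points[:, None] - points[None], axis=-1)
    idx = np.argsort(dist, axis=1)[:, :k]          # each point's k neighbours
    normals = np.empty_like(points)
    for s, nb in enumerate(idx):
        centered = points[nb] - points[nb].mean(axis=0)
        # the right-singular vector of least variance is the plane normal
        _, _, vt = np.linalg.svd(centered, full_matrices=False)
        normals[s] = vt[-1] / np.linalg.norm(vt[-1])
    return normals

rng = np.random.default_rng(0)
pts = rng.random((200, 3))
pts[:, 2] = 0.0                    # a planar cloud: normals should be ±z
nrm = estimate_normals(pts)
```

For large clouds a k-d tree would replace the brute-force distance matrix, but the PCA step is unchanged.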
Further, the specific steps of step two are:
mapping the lidar point cloud P to a weighted graph G = (V, E), in which each vertex V_i in V represents a point P_i of the lidar point cloud and each vertex V_i is joined by an edge to the 50 vertices nearest to it in Euclidean space; the weight of each edge e ∈ E is W(e) = |I_a − I_b|, where a and b are the indices of the vertices V_a and V_b joined by e; initializing the supervoxel segmentation of the point cloud so that each supervoxel corresponds to a single vertex; traversing the edges of E in order of increasing weight and judging whether the two vertices joined by e belong to two different supervoxels: if not, continuing to the next edge; if so, judging whether W(e) is smaller than the distance between the two supervoxels joined by e, merging the two supervoxels joined by e if it is, and continuing to the next edge otherwise; obtaining, after the traversal ends, the supervoxel set S = {S_1, S_2, …, S_t}, where t is the number of supervoxels; then calculating each element B_ij of the supervoxel representation matrix:
[equation image: definition of B_ij]
and, from the matrix definition and B_ij, obtaining the supervoxel representation matrix B.
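The edge-traversal merging described above follows the pattern of classic graph-based segmentation in the style of Felzenszwalb and Huttenlocher. Since the patent does not spell out the "distance between two supervoxels", the sketch below substitutes that method's internal-difference criterion (internal difference plus tau/size), which is an assumption:

```python
import numpy as np

def supervoxel_segmentation(points, intensity, k=50, tau=1.0):
    """Supervoxel segmentation sketch: kNN graph with intensity-difference
    edge weights W(e) = |I_a - I_b|, edges processed by increasing weight,
    regions merged when the weight is below each region's internal
    difference plus tau/size (the merge threshold is an assumption)."""
    n = len(points)
    d = np.linalg.norm(points[:, None] - points[None], axis=-1)
    edges = []
    for i in range(n):
        for j in np.argsort(d[i])[1:k + 1]:        # k nearest neighbours
            edges.append((abs(intensity[i] - intensity[j]), i, int(j)))
    edges.sort()                                   # increasing weight

    parent = list(range(n))                        # union-find forest
    internal = [0.0] * n                           # max edge weight inside a region
    size = [1] * n

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    for w, a, b in edges:
        ra, rb = find(a), find(b)
        if ra != rb and w <= min(internal[ra] + tau / size[ra],
                                 internal[rb] + tau / size[rb]):
            parent[rb] = ra
            size[ra] += size[rb]
            internal[ra] = max(internal[ra], internal[rb], w)
    return np.array([find(i) for i in range(n)])

# Two spatial clusters with distinct lidar intensities
rng = np.random.default_rng(0)
pts = np.vstack([rng.random((20, 3)),
                 rng.random((20, 3)) + np.array([5.0, 0.0, 0.0])])
inten = np.concatenate([np.zeros(20), np.full(20, 10.0)])
labels = supervoxel_segmentation(pts, inten, k=15, tau=1.0)
```

Because the intensity jump between the clusters far exceeds the merge threshold, the segmentation separates them into two supervoxels.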
Further, the specific steps of step three are:
first, establishing a relational expression for each band λ:
[equation image: per-band relation]
GL = 0
where each row of the matrix G is the 1×3 row vector of one such relation;
then performing an eigendecomposition of the 3×3 matrix G^T G and sorting the eigenvalues and their corresponding eigenvectors in increasing order of eigenvalue: {e_1, v_1}, {e_2, v_2}, {e_3, v_3}; the eigenvector v_1 corresponding to the smallest eigenvalue e_1 is the incident illumination direction L.
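The eigen-decomposition of step three can be verified numerically. The per-band relation survives only as an image; under a Lambertian model h = r·(n·L) with reflectance shared inside a supervoxel, rows of the form g = h_a·n_b − h_b·n_a are consistent with GL = 0 (since g·L = r(n_a·L)(n_b·L) − r(n_b·L)(n_a·L) = 0), so that row construction is an assumption used here for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
L_true = np.array([0.2, 0.3, 0.93])
L_true /= np.linalg.norm(L_true)

# Synthetic Lambertian supervoxel: all pixels share reflectance r,
# so each observation is h = r * (n . L).
normals = rng.normal(size=(30, 3))
normals /= np.linalg.norm(normals, axis=1, keepdims=True)
normals[normals @ L_true < 0] *= -1        # keep every face lit
r = 0.7
h = r * (normals @ L_true)

# Assumed row construction: g = h_a * n_b - h_b * n_a, one row per pair
rows = []
for a in range(len(h)):
    for b in range(a + 1, len(h)):
        rows.append(h[a] * normals[b] - h[b] * normals[a])
G = np.array(rows)

# L is the eigenvector of G^T G with the smallest eigenvalue
eigvals, eigvecs = np.linalg.eigh(G.T @ G)   # eigh sorts ascending
L_est = eigvecs[:, 0]
if L_est @ L_true < 0:                       # eigenvectors are sign-ambiguous
    L_est = -L_est
```

On this noise-free data the smallest eigenvalue is numerically zero and the recovered direction matches L up to sign.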
Further, the specific steps of step four are:
first, calculating the element in row i, column j of the joint hyperspectral image-lidar point cloud intrinsic decomposition matrix M, expressed as:
[equation image: definition of M_ij]
then, from the matrix definition and M_ij, obtaining the joint intrinsic decomposition matrix M, where B_jg, g = 1, 2, …, t denotes the element in row j, column g of the supervoxel representation matrix B;
letting R̃ = {r̃_1, r̃_2, …, r̃_t} be the set of intrinsic hyperspectral reflectances of the supervoxels, with r̃_g the intrinsic hyperspectral reflectance of each supervoxel, the joint intrinsic decomposition relation is:
[equation image: joint intrinsic decomposition relation]
which is solved to obtain R̃; then, according to
[equation image: per-point reflectance from R̃ and B]
the intrinsic hyperspectral reflectance of all points R = {r_s | s = 1, 2, …, v} is obtained, where r_s = [r_s(λ1), r_s(λ2), …, r_s(λd)]^T is the intrinsic hyperspectral reflectance of each point; finally, the intrinsic hyperspectral reflectance r_s of each point is stacked with the spatial coordinates [x_s, y_s, z_s]^T of the lidar point cloud P to generate the intrinsic hyperspectral point cloud R_P = {r̂_s | s = 1, 2, …, v}, where r̂_s is the attribute feature of each point of the hyperspectral point cloud.
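Step four's solve can be illustrated with synthetic data. The matrix M survives only as an image; the sketch below assumes the structure M = A·diag(n_j·L)·B (pixel-point correspondence A, per-point Lambertian shading, supervoxel membership B), so that H ≈ M·R̃ and the supervoxel reflectances follow from least squares. Every name and shape here is an illustrative assumption, not the patent's formula:

```python
import numpy as np

rng = np.random.default_rng(1)
u = v = 40          # pixels and points (assumed matched 1:1 here)
t, d = 3, 5         # supervoxels and spectral bands

A = np.eye(u, v)                         # pixel-to-point correspondence
labels = np.arange(v) % t                # every supervoxel gets some points
B = np.eye(t)[labels]                    # v x t supervoxel membership matrix
shading = rng.uniform(0.2, 1.0, size=v)  # n_j . L per point (kept positive)
M = A @ np.diag(shading) @ B             # assumed joint decomposition matrix

R_true = rng.uniform(size=(t, d))        # supervoxel reflectances r~_g
H = M @ R_true                           # synthetic hyperspectral pixels

# Solve the joint relation H = M R~ for the supervoxel reflectances
R_est, *_ = np.linalg.lstsq(M, H, rcond=None)
r_points = R_est[labels]                 # intrinsic reflectance of every point
```

Because each supervoxel contains at least one lit point, M has full column rank and the least-squares solve recovers the reflectances exactly on this noise-free example.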
The beneficial effects of the invention are:
the method exploits the physical model of hyperspectral imaging to combine, at the signal level, the spectral information of the hyperspectral image with the three-dimensional spatial information of the lidar point cloud. It removes the spectral uncertainty caused by illumination-induced degradation of the hyperspectral image's spectral information, and its object-oriented supervoxel scheme greatly accelerates the algorithm. The result is an intrinsic hyperspectral point cloud that integrates spectral and three-dimensional spatial information, largely avoiding information loss and spectral distortion.
Drawings
FIG. 1 is a flow chart of the present application;
FIG. 2 is an input hyperspectral image;
FIG. 3 is an input lidar point cloud;
FIG. 4 is a schematic diagram of the supervoxel segmentation result generated by the present invention;
FIG. 5 is a schematic diagram of an intrinsic hyperspectral point cloud generated by the present invention;
FIG. 6 is a schematic diagram of the error comparison between the hyperspectral point cloud generated by the present invention and other methods.
Detailed Description
It should be noted that, in the present invention, the embodiments disclosed in the present application may be combined with each other without conflict.
Embodiment one: this embodiment is described in detail with reference to FIG. 1. The method for generating an intrinsic hyperspectral point cloud from a hyperspectral image and lidar data according to this embodiment comprises the following steps:
step one: acquiring a hyperspectral image and a lidar point cloud, and obtaining an intrinsic mapping matrix from the hyperspectral image and the lidar point cloud;
step two: performing supervoxel segmentation of the lidar point cloud to generate a supervoxel representation matrix;
step three: obtaining the incident illumination direction from the intrinsic mapping matrix;
step four: performing supervoxel-based joint intrinsic decomposition using the intrinsic mapping matrix, the supervoxel representation matrix, and the incident illumination direction to generate the intrinsic hyperspectral point cloud.
Step 1: input a hyperspectral image H = {h_k | k = 1, 2, …, u} and a lidar point cloud P = {p_k | k = 1, 2, …, v}, where h_k = [h_k(λ1), h_k(λ2), …, h_k(λd)]^T, k = 1, 2, …, u is the spectral feature of each pixel, λ is the wavelength, d is the number of bands, and u is the number of hyperspectral image pixels; p_k = [x_k, y_k, z_k, I_k]^T, k = 1, 2, …, v, where (x_k, y_k, z_k) is the coordinate of each point, I_k is the lidar intensity, and v is the number of lidar points. First, calculate the normal of each point in the lidar point cloud P to obtain the normal features N = {n_k | k = 1, 2, …, v}, where n_k = [N_x, N_y, N_z]^T, k = 1, 2, …, v is the projection of the normal onto the x, y, z coordinate axes. Compute the hyperspectral-lidar mapping matrix A:
[equation image: definition of A_ij]
Further, the intrinsic mapping matrix can be calculated from A and N:
[equation image: intrinsic mapping matrix]
step 2: mapping the lidar point cloud to a weighted graph G ═ V, E, such that each vertex V in V isiRepresenting individual points P in a lidar point cloudiAnd each vertex is connected with 50 vertexes nearest to the vertex in the Euclidean space by an edge, and the weight W (E) of each edge E belonging to the E is calculated as Ia-IbI, where a and b are the numbers of the two vertices that the edge e joins. Supervoxel segmentation of initialized point clouds
Figure BDA0002910970420000051
Such that each hyper-voxel corresponds to each vertex itself. Traversing the edges in the E according to the order of the weight values from small to large, judging whether two vertexes connected by the E belong to two different hyper-voxels, and if not, continuing to obtain the next edge; if w (e) is less than the distance between those two superpixels, then the e-connected superpixels are merged, or else the next edge is continued. The set of hyper-voxels obtained after the traversal is finished is
Figure BDA0002910970420000052
Where t is the number of voxels. According to
Figure BDA0002910970420000053
Computing a voxel representation matrix
Figure BDA0002910970420000054
Comprises the following steps:
Figure BDA0002910970420000055
Step 3: if two pixels H_a and H_b of the hyperspectral image have corresponding points in the lidar point cloud that belong to the same supervoxel, establish a relational expression for each band λ:
[equation image: per-band relation between H_a and H_b]
Then collect the series of equations obtained from the above formula into a system of equations in L:
GL = 0
where each row of the matrix G corresponds to one such equation and is a 1×3 row vector. Perform an eigendecomposition of the 3×3 matrix G^T G (since GL = 0 implies G^T G L = 0, L is an eigenvector of G^T G with eigenvalue 0), and sort the eigenvalue-eigenvector pairs obtained by the decomposition in increasing order of eigenvalue: {e_1, v_1}, {e_2, v_2}, {e_3, v_3}. The illumination direction L is the eigenvector v_1 corresponding to the smallest eigenvalue e_1.
Step 4: compute the joint hyperspectral image-lidar point cloud intrinsic decomposition matrix M:
[equation image: definition of M_ij]
such that
[equation image: relation satisfied by M]
Let R̃ = {r̃_1, r̃_2, …, r̃_t} be the intrinsic hyperspectral reflectances of the supervoxels, where
[equation image: definition of r̃_g]
From this, the following joint intrinsic decomposition relation is obtained:
[equation image: joint intrinsic decomposition relation]
which is solved to obtain R̃. The intrinsic hyperspectral reflectance of each point is then calculated as
[equation image: per-point reflectance from R̃ and B]
and the intrinsic hyperspectral point cloud is finally generated:
[equation image: intrinsic hyperspectral point cloud R_P]
Here A is a u × v matrix and A_ij denotes the element of A in row i, column j.
the experiment designed by the invention comprises the following steps:
the data used in the experiment are data of Houston university, and FIG. 2 is an input hyperspectral image; FIG. 3 is an input lidar point cloud; FIG. 4 is a super voxel segmentation result generated by the present invention; FIG. 5 is an intrinsic hyperspectral point cloud generated by the present invention; FIG. 6 is an error comparison of the hyperspectral point cloud generated by the present invention with other methods. It can be seen from fig. 6 that the error of the intrinsic hyperspectral point cloud generated by the invention is minimum.
It should be noted that the detailed description is intended only to explain the technical solution of the invention and does not limit the scope of the claims. All modifications and variations that fall within the claims and the description are intended to be covered by the scope of the invention.

Claims (5)

1. A method for generating an intrinsic hyperspectral point cloud from a hyperspectral image and lidar data, characterized by comprising the following steps:
step one: acquiring a hyperspectral image and a lidar point cloud, and obtaining an intrinsic mapping matrix from the hyperspectral image and the lidar point cloud;
step two: performing supervoxel segmentation of the lidar point cloud to generate a supervoxel representation matrix;
step three: obtaining the incident illumination direction from the intrinsic mapping matrix;
step four: performing supervoxel-based joint intrinsic decomposition using the intrinsic mapping matrix, the supervoxel representation matrix, and the incident illumination direction to generate the intrinsic hyperspectral point cloud.
2. The method for generating an intrinsic hyperspectral point cloud from a hyperspectral image and lidar data according to claim 1, characterized in that the specific steps of step one are:
inputting a hyperspectral image H = {h_k | k = 1, 2, …, u} and a lidar point cloud P = {p_s | s = 1, 2, …, v}, where h_k = [h_k(λ1), h_k(λ2), …, h_k(λd)]^T, k = 1, 2, …, u is the spectral feature of each pixel, λ is the wavelength, d is the number of bands, and u is the total number of hyperspectral image pixels; the attribute feature of each point is p_s = [x_s, y_s, z_s, I_s]^T, s = 1, 2, …, v, where (x_s, y_s, z_s) is the coordinate of the point, I_s is the lidar intensity, and v is the total number of lidar points;
calculating the normal of each point in the lidar point cloud P to obtain the normal features N = {n_s | s = 1, 2, …, v}, where n_s = [N_x, N_y, N_z]^T is the projection of the normal onto the x, y, z coordinate axes, and then calculating each element A_ij of the matrix A:
[equation image: definition of A_ij]
further calculating, from A_ij and N, each element of the intrinsic mapping matrix:
[equation image: definition of the intrinsic mapping matrix elements]
and, from the matrix definition and these elements, obtaining the intrinsic mapping matrix.
3. The method for generating an intrinsic hyperspectral point cloud from a hyperspectral image and lidar data according to claim 2, characterized in that the specific steps of step two are:
mapping the lidar point cloud P to a weighted graph G = (V, E), in which each vertex V_i in V represents a point P_i of the lidar point cloud and each vertex V_i is joined by an edge to the 50 vertices nearest to it in Euclidean space, the weight of each edge e ∈ E being W(e) = |I_a − I_b|, where a and b are the indices of the vertices V_a and V_b joined by e; initializing the supervoxel segmentation of the point cloud so that each supervoxel corresponds to a single vertex; traversing the edges of E in order of increasing weight and judging whether the two vertices joined by e belong to two different supervoxels: if not, continuing to the next edge; if so, judging whether W(e) is smaller than the distance between the two supervoxels joined by e, merging the two supervoxels joined by e if it is, and continuing to the next edge otherwise; obtaining, after the traversal ends, the supervoxel set S = {S_1, S_2, …, S_t}, where t is the number of supervoxels; then calculating each element B_ij of the supervoxel representation matrix:
[equation image: definition of B_ij]
and, from the matrix definition and B_ij, obtaining the supervoxel representation matrix B.
4. The method for generating an intrinsic hyperspectral point cloud from a hyperspectral image and lidar data according to claim 3, characterized in that the specific steps of step three are:
first, establishing a relational expression for each band λ:
[equation image: per-band relation]
GL = 0
where each row of the matrix G is the 1×3 row vector of one such relation;
then performing an eigendecomposition of the 3×3 matrix G^T G and sorting the eigenvalues and their corresponding eigenvectors in increasing order of eigenvalue: {e_1, v_1}, {e_2, v_2}, {e_3, v_3}; the eigenvector v_1 corresponding to the smallest eigenvalue e_1 is the incident illumination direction L.
5. The method for generating an intrinsic hyperspectral point cloud from a hyperspectral image and lidar data according to claim 4, characterized in that the specific steps of step four are:
first, calculating the element in row i, column j of the joint hyperspectral image-lidar point cloud intrinsic decomposition matrix M, expressed as:
[equation image: definition of M_ij]
then, from the matrix definition and M_ij, obtaining the joint intrinsic decomposition matrix M, where B_jg, g = 1, 2, …, t denotes the element in row j, column g of the supervoxel representation matrix B;
letting R̃ = {r̃_1, r̃_2, …, r̃_t} be the set of intrinsic hyperspectral reflectances of the supervoxels, with r̃_g the intrinsic hyperspectral reflectance of each supervoxel, the joint intrinsic decomposition relation being:
[equation image: joint intrinsic decomposition relation]
which is solved to obtain R̃; then, according to
[equation image: per-point reflectance from R̃ and B]
obtaining the intrinsic hyperspectral reflectance of all points R = {r_s | s = 1, 2, …, v}, where r_s = [r_s(λ1), r_s(λ2), …, r_s(λd)]^T is the intrinsic hyperspectral reflectance of each point; finally, stacking the intrinsic hyperspectral reflectance r_s of each point with the spatial coordinates [x_s, y_s, z_s]^T of the lidar point cloud P to generate the intrinsic hyperspectral point cloud R_P = {r̂_s | s = 1, 2, …, v}, where r̂_s is the attribute feature of each point of the hyperspectral point cloud.
CN202110086571.1A (filed 2021-01-22, priority 2021-01-22): Hyperspectral image and laser radar data intrinsic hyperspectral point cloud generation method. Active. Granted as CN112819959B.

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110086571.1A CN112819959B (en) 2021-01-22 2021-01-22 Hyperspectral image and laser radar data intrinsic hyperspectral point cloud generation method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110086571.1A CN112819959B (en) 2021-01-22 2021-01-22 Hyperspectral image and laser radar data intrinsic hyperspectral point cloud generation method

Publications (2)

Publication Number Publication Date
CN112819959A: 2021-05-18
CN112819959B: 2022-03-04

Family

ID=75858945

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110086571.1A Active CN112819959B (en) 2021-01-22 2021-01-22 Hyperspectral image and laser radar data intrinsic hyperspectral point cloud generation method

Country Status (1)

Country Link
CN (1) CN112819959B (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113408635A (en) * 2021-06-29 2021-09-17 哈尔滨工业大学 Hyperspectral image eigen decomposition method based on assistance of digital surface model
CN113936103A (en) * 2021-12-14 2022-01-14 星际空间(天津)科技发展有限公司 Method and equipment for constructing laser point cloud graph model
CN115331110A (en) * 2022-08-26 2022-11-11 苏州大学 Fusion classification method and device for remote sensing hyperspectral image and laser radar image

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2018045626A1 (en) * 2016-09-07 2018-03-15 深圳大学 Super-pixel level information fusion-based hyperspectral image classification method and system
CN108427913A (en) * 2018-02-05 2018-08-21 中国地质大学(武汉) The Hyperspectral Image Classification method of combined spectral, space and hierarchy information
CN109087341A (en) * 2018-06-07 2018-12-25 华南农业大学 A kind of fusion method of short distance EO-1 hyperion camera and distance measuring sensor


Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
YANFENG GU et al.: "UAV-based integrated multispectral-LiDAR imaging system and data processing", Science China Technological Sciences *
WANG QINGWANG (王青旺): "Research on joint classification methods for multi-/hyperspectral images and LiDAR data", China Doctoral Dissertations Full-text Database, Engineering Science and Technology II *


Also Published As

Publication number Publication date
CN112819959B (en) 2022-03-04

Similar Documents

Publication Publication Date Title
CN112819959B (en) Hyperspectral image and laser radar data intrinsic hyperspectral point cloud generation method
CN110009674B (en) Monocular image depth of field real-time calculation method based on unsupervised depth learning
CN110853075B (en) Visual tracking positioning method based on dense point cloud and synthetic view
Yurtseven et al. Determination and accuracy analysis of individual tree crown parameters using UAV based imagery and OBIA techniques
CN107204037B (en) Three-dimensional image generation method based on active and passive three-dimensional imaging system
CN112686935B (en) Airborne sounding radar and multispectral satellite image registration method based on feature fusion
CN110807828B (en) Oblique photography three-dimensional reconstruction matching method
CN112130169B (en) Point cloud level fusion method for laser radar data and hyperspectral image
CN103822616A (en) Remote-sensing image matching method with combination of characteristic segmentation with topographic inequality constraint
Youssefi et al. Cars: A photogrammetry pipeline using dask graphs to construct a global 3d model
CN113077552A (en) DSM (digital communication system) generation method and device based on unmanned aerial vehicle image
Moghaddam et al. A statistical variable selection solution for RFM ill-posedness and overparameterization problems
Bybee et al. Method for 3-D scene reconstruction using fused LiDAR and imagery from a texel camera
Parmehr et al. Automatic parameter selection for intensity-based registration of imagery to LiDAR data
CN113256696B (en) External parameter calibration method of laser radar and camera based on natural scene
Hong et al. Liv-gaussmap: Lidar-inertial-visual fusion for real-time 3d radiance field map rendering
CN112785693B (en) Method, system and device for generating intrinsic hyperspectral point cloud
CN116994029A (en) Fusion classification method and system for multi-source data
CN117197333A (en) Space target reconstruction and pose estimation method and system based on multi-view vision
CN116152800A (en) 3D dynamic multi-target detection method, system and storage medium based on cross-view feature fusion
CN116682105A (en) Millimeter wave radar and visual feature attention fusion target detection method
CN114359660B (en) Multi-modal target detection method and system suitable for modal intensity change
CN114565653A (en) Heterogeneous remote sensing image matching method with rotation change and scale difference
Gonçalves Using structure-from-motion workflows for 3D mapping and remote sensing
Kern et al. An accurate real-time Uav mapping solution for the generation of Orthomosaics and surface models

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant