CN117314813B - Hyperspectral image band fusion method, system and medium


Info

Publication number: CN117314813B (granted patent; application published as CN117314813A)
Application number: CN202311615632.4A
Authority: CN (China)
Original language: Chinese (zh)
Inventors: 刘鸿飞, 何智超
Applicant/Assignee: Aopu Tiancheng Hunan Information Technology Co ltd
Legal status: Active (granted)
Prior art keywords: image, hyperspectral, hyperspectral image, images, band

Classifications

    • G06T5/50 — Image enhancement or restoration by the use of more than one image, e.g. averaging, subtraction (G PHYSICS › G06 COMPUTING › G06T IMAGE DATA PROCESSING OR GENERATION)
    • G06T2207/10032 — Satellite or aerial image; Remote sensing
    • G06T2207/20221 — Image fusion; Image merging
    • Y02A40/10 — Adaptation technologies in agriculture

Abstract

The invention provides a hyperspectral image band fusion method, system and medium. The method comprises: acquiring a plurality of hyperspectral images of the same calibration pattern, the images corresponding to different wavelength ranges; obtaining the calibration pattern coordinates in each hyperspectral image; calculating, from the calibration pattern coordinates, the coordinates of the maximum overlap region of the hyperspectral images within each image; cropping the maximum overlap region from each hyperspectral image according to those coordinates; resampling each cropped hyperspectral image; and performing band fusion on the resampled images to obtain a fused hyperspectral image. The invention expands the band range of the hyperspectral image and improves its quality.

Description

Hyperspectral image band fusion method, system and medium
Technical Field
The invention relates to the technical field of remote sensing, and in particular to a hyperspectral image band fusion method, system and medium.
Background
Remote sensing is the science and technology of acquiring information about the earth's surface using satellites, aircraft, unmanned aerial vehicles or other remote sensors. Hyperspectral remote sensing is one such technology: by acquiring surface reflectance spectra through a sensor with many continuous narrow bands, it can provide rich spectral information.
Existing hyperspectral imaging generally uses a single camera for data acquisition, so the wavelength range is narrow. The more bands there are, the more accurate and richer the surface information that can be obtained, providing more comprehensive data support for surface monitoring, resource management and environmental analysis.
Disclosure of Invention
The present invention aims to solve, at least to some extent, one of the technical problems described above. The invention therefore provides a hyperspectral image band fusion method, system and medium that expand the band range of a hyperspectral image so as to acquire more accurate and richer image information.
To this end, an embodiment of the first aspect of the present invention provides a hyperspectral image band fusion method, comprising:
acquiring a plurality of hyperspectral images of the same calibration pattern, the hyperspectral images corresponding to different wavelength ranges;
obtaining the calibration pattern coordinates of each hyperspectral image;
calculating, from the calibration pattern coordinates, the coordinates of the maximum overlap region of the hyperspectral images within each hyperspectral image;
cropping the maximum overlap region from each hyperspectral image according to those coordinates;
resampling each cropped hyperspectral image to obtain the resampled hyperspectral images; and
performing band fusion on the resampled hyperspectral images to obtain a fused hyperspectral image.
With the hyperspectral image band fusion method of this embodiment, hyperspectral images of the same subject but covering several different wavelength ranges can be band-fused into a single hyperspectral image whose wavelength range is the union of all the input ranges. This meets the need to expand the wavelength range of hyperspectral images and greatly improves their quality.
In addition, the hyperspectral image band fusion method of this embodiment may have the following additional technical features:
Optionally, obtaining the calibration pattern coordinates of each hyperspectral image comprises:
converting each hyperspectral image into an ordinary image using the pseudo-color band data of that image;
converting each ordinary image into a grayscale image;
making a mask from the position of the calibration pattern in each grayscale image;
applying each mask to its grayscale image to obtain the masked images;
binarizing each masked image;
extracting the largest contour in each binarized image and computing its minimum bounding rectangle; and
defining the diagonal coordinates of the minimum bounding rectangle as the calibration pattern coordinates of the corresponding hyperspectral image.
Optionally, calculating the coordinates of the maximum overlap region within each hyperspectral image from the calibration pattern coordinates comprises:
taking the hyperspectral image with the smallest resolution as the reference image;
computing, from the calibration pattern coordinates, the width and height ratios between the calibration pattern in each hyperspectral image and the calibration pattern in the reference image, to obtain the scaling factors of each hyperspectral image;
computing, from the calibration pattern coordinates, the distances from the calibration pattern to the image boundary in each hyperspectral image;
scaling each hyperspectral image by its scaling factors and computing the scaled boundary distances of its four sides;
taking, for each of the four sides, the minimum of the scaled boundary distances over all images, and defining these minima as the boundary distances of the maximum overlap region; and
computing, from these boundary distances, the coordinates of the maximum overlap region in each hyperspectral image.
Optionally, performing band fusion on the resampled hyperspectral images to obtain a fused hyperspectral image comprises:
for each overlapping wavelength, computing the single-band image noise of each corresponding hyperspectral image using a semivariance function;
counting the number of bands classified as low-noise from the single-band image noise;
if the number of low-noise bands is less than or equal to a threshold, taking the single-band image with the least noise as the single-band image for that overlapping wavelength;
if the number of low-noise bands is greater than the threshold, weight-fusing the low-noise single-band images into one single-band image and taking it as the single-band image for that overlapping wavelength;
obtaining hyperspectral images in which every wavelength corresponds to exactly one band image; and
merging the obtained hyperspectral images to produce the fused hyperspectral image.
Optionally, computing the single-band image noise of each hyperspectral image using the semivariance function comprises:
arranging the pixel values of each hyperspectral image at an overlapping wavelength into a one-dimensional vector in row-major order;
forming pixel pairs from every two pixels in each one-dimensional vector and computing the distance between the pixels of a pair as the Manhattan distance;
computing the squared difference between the two pixels of a pair as the difference of that pair;
grouping the pixel pairs of each one-dimensional vector by distance and computing the average difference of each group;
determining the single-band image noise of a hyperspectral image as the proportion of groups whose average difference exceeds a difference threshold, out of all groups for that vector; and
obtaining the single-band image noise of each hyperspectral image.
Optionally, the hyperspectral image with the smallest resolution is taken as the reference image, and each cropped hyperspectral image is resampled to the resolution of the reference image.
Optionally, the calibration pattern is a calibration rectangle.
To achieve the above object, an embodiment of the second aspect of the present invention provides a hyperspectral image band fusion system comprising a computer-readable storage medium and a processor; the computer-readable storage medium stores a computer program which, when executed by the processor, implements the steps of the hyperspectral image band fusion method described above.
With the hyperspectral image band fusion system of this embodiment, hyperspectral images of the same subject but covering several different wavelength ranges can be band-fused into a single hyperspectral image whose wavelength range is the union of all the input ranges, meeting the need to expand the wavelength range of hyperspectral images and greatly improving their quality.
To achieve the above object, an embodiment of the third aspect of the present invention provides a computer-readable storage medium storing a computer program which, when executed by a processor, implements the steps of the hyperspectral image band fusion method described above.
When the computer program on this computer-readable storage medium is executed, hyperspectral images of the same subject but covering several different wavelength ranges can be band-fused into a single hyperspectral image whose wavelength range is the union of all the input ranges, meeting the need to expand the wavelength range of hyperspectral images and greatly improving their quality.
Drawings
Fig. 1 is a schematic flow chart of a hyperspectral image band fusion method according to an embodiment of the present invention;
Fig. 2 is a schematic diagram of a maximum overlap region in an embodiment of the present invention;
Figs. 3(a)-(c) are schematic diagrams of the distances from the calibration rectangle to the image boundary in the reference image img_base, image img1 and image img2, respectively, according to embodiments of the invention;
Fig. 4 is a schematic flow chart of band fusion of the resampled hyperspectral images to obtain a fused hyperspectral image according to an embodiment of the present invention.
Detailed Description
Embodiments of the present invention are described in detail below, examples of which are illustrated in the accompanying drawings, wherein like or similar reference numerals refer to like or similar elements or elements having like or similar functions throughout. The embodiments described below by referring to the drawings are illustrative and intended to explain the present invention and should not be construed as limiting the invention.
The existing single-camera hyperspectral imaging technology suffers from low image quality caused by a narrow image wavelength range. To address this defect, the present application band-fuses hyperspectral images of the same subject covering several different wavelength ranges into a hyperspectral image whose wavelength range is the union of all the input ranges, thereby greatly improving hyperspectral image quality.
In order that the above-described aspects may be better understood, exemplary embodiments of the present invention will be described in more detail below with reference to the accompanying drawings. While exemplary embodiments of the present invention are shown in the drawings, it should be understood that the present invention may be embodied in various forms and should not be limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the invention to those skilled in the art.
In order to better understand the above technical solutions, the following detailed description will refer to the accompanying drawings and specific embodiments.
Fig. 1 is a schematic flow chart of a hyperspectral image band fusion method according to an embodiment of the present invention. As shown in Fig. 1, the method comprises at least the following steps S1 to S6.
In step S1, a plurality of hyperspectral images of the same calibration pattern are acquired, the hyperspectral images corresponding to different wavelength ranges.
That is, the hyperspectral images are acquired by photographing the same calibration pattern with several hyperspectral cameras covering different wavelength ranges. The calibration pattern is preferably a rectangle, which simplifies the calculations. Preferably, the total wavelength range covered by the hyperspectral images is as wide as possible, so that the fused hyperspectral image covers a correspondingly wide range. Of course, the band ranges covered by the input images can also be configured flexibly according to the accuracy requirements of the final image or the available computing power.
In step S2, the calibration pattern coordinates of each hyperspectral image are obtained, i.e. the calibration pattern is located in each hyperspectral image.
In some implementations of this embodiment, step S2 may be carried out as follows:
S21: converting each hyperspectral image into an ordinary image using the pseudo-color band data of that image.
It will be understood that the data file produced when the camera acquires a hyperspectral image records three pseudo-color bands; extracting these three bands converts the hyperspectral image into an ordinary image.
S22: converting each ordinary image into a grayscale image.
S23: making a mask from the position of the calibration pattern in each grayscale image.
It will be appreciated that the mask hides the image information outside the calibration pattern and preserves the desired calibration pattern area.
S24: applying each mask to its grayscale image to obtain the masked images.
The masked image can be obtained by a bitwise AND of the grayscale image and the mask.
S25: binarizing each masked image.
The masked image can be binarized with a user-defined threshold to obtain the corresponding binarized image.
S26: extracting the largest contour in each binarized image and computing its minimum bounding rectangle.
It will be appreciated that a find-contours operation on the binarized image yields its contours, of which the largest is kept; the largest contour is then expanded into a rectangle so that its coordinates can be computed easily. The minimum bounding rectangle of the largest contour is preferred as the expanded rectangle; as one possible implementation, it can be obtained with the OpenCV library.
Preferably, since the calibration pattern is rectangular and the maximum overlap region of the cameras is also rectangular, expanding the largest contour into a rectangle is consistent with these shapes and simplifies the calculations.
S27: defining the diagonal coordinates of the minimum bounding rectangle as the calibration pattern coordinates of the corresponding hyperspectral image.
The diagonal coordinates may be the upper-left and lower-right coordinates of the rectangle, or equally the lower-left and upper-right coordinates.
In this way, the coordinates of the calibration pattern in each hyperspectral image are found.
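As an illustrative sketch (not the patent's implementation), the final part of S26-S27 — reading the diagonal coordinates of the bounding rectangle off a binarized image — can be written as follows. In a full pipeline the binarized image would come from thresholding and the largest contour would be isolated first (e.g. with OpenCV); the function name and the assumption that all foreground pixels belong to the calibration pattern are ours.

```python
def calibration_coords(binary):
    """Diagonal (upper-left, lower-right) coordinates of the minimum
    bounding rectangle of all foreground pixels in a binarized image
    given as a 2-D list. This sketch assumes the mask already isolates
    the calibration pattern, so every non-zero pixel belongs to it;
    a real pipeline would first keep only the largest contour."""
    pts = [(x, y) for y, row in enumerate(binary)
                  for x, v in enumerate(row) if v]
    xs = [p[0] for p in pts]
    ys = [p[1] for p in pts]
    return (min(xs), min(ys)), (max(xs), max(ys))
```

The returned pair corresponds to the (x1, y1), (x2, y2) notation used in the detailed embodiment below.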
In step S3, the coordinates of the maximum overlap region of the hyperspectral images within each hyperspectral image are calculated from the calibration pattern coordinates.
Here, an overlap region is a portion of the scene, such as the calibration pattern, that appears in all hyperspectral images; the maximum overlap region is the largest such portion. As shown in Fig. 2, the middle rectangle is the calibration rectangle, and the overlap region of hyperspectral images img1 and img2 is the maximum overlap region.
In this embodiment, the actual coordinates of the maximum overlap region in each image are derived back from the calibration pattern coordinates.
In some implementations of this embodiment, step S3 may be carried out as follows:
S31: taking the hyperspectral image with the smallest resolution as the reference image.
S32: computing, from the calibration pattern coordinates, the width and height ratios between the calibration pattern in each hyperspectral image and the calibration pattern in the reference image, to obtain the scaling factors of each hyperspectral image.
S33: computing, from the calibration pattern coordinates, the distances from the calibration pattern to the image boundary in each hyperspectral image.
S34: scaling each hyperspectral image by its scaling factors, so that after scaling the same area has the same size in every image, and computing the scaled distances from the calibration pattern in each hyperspectral image to the four sides of the maximum overlap region. As shown in Fig. 2, the scaled boundary distances are h_res, l_res, r_res and b_res.
It will be appreciated that the size of the calibration rectangle in each hyperspectral image is registered to the size of the calibration rectangle in the reference image, so that the boundary distances of all hyperspectral images share the same metric.
S35: taking, for each of the four sides, the minimum of the scaled boundary distances over all images, and defining these minima as the boundary distances of the maximum overlap region.
S36: computing, from these boundary distances, the coordinates of the maximum overlap region in each hyperspectral image.
In step S4, the maximum overlap region is cropped from each hyperspectral image according to its coordinates in that image.
Optionally, the cropped maximum overlap region can be saved to local storage. This frees processor memory, avoiding stalls or crashes, and is convenient for subsequent operations: hyperspectral images are large and memory-hungry, and the memory limits of a computer sometimes prevent them from being processed in one pass, so oversized intermediate images can be stored locally for later processing.
In step S5, each cropped hyperspectral image is resampled to obtain the resampled hyperspectral images.
The purpose of resampling is to keep the spatial dimensions consistent.
A hyperspectral image has many bands, high spectral resolution and a very large data volume, so under the limited memory of a computer only one band, or a few bands, can be processed at a time. In this embodiment, single-band processing is preferred: taking the resolution of the reference image (the hyperspectral image with the smallest resolution) as the target, a local-pixel resampling method brings all images to the same resolution, i.e. keeps the spatial dimensions consistent.
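As a minimal sketch of per-band resampling to the reference resolution — the patent does not fix the interpolation kernel, so nearest-neighbour lookup is shown here as one possible local-pixel method, and the function name is ours:

```python
def resample_band(band, out_w, out_h):
    """Resample one single-band image (a 2-D list) to out_w x out_h
    by nearest-neighbour lookup -- a simple local-pixel method that
    brings every band to the reference image's spatial dimensions."""
    in_h, in_w = len(band), len(band[0])
    return [[band[y * in_h // out_h][x * in_w // out_w]
             for x in range(out_w)]
            for y in range(out_h)]
```

In practice each band would be resampled in turn and written back to local storage, consistent with the single-band processing described above.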
In step S6, band fusion is performed on the resampled hyperspectral images to obtain the fused hyperspectral image.
Because the wavelength ranges of the hyperspectral images overlap, one wavelength may correspond to single-band images from several images. A decision rule is therefore needed to select the band of one image, or a fusion rule to fuse the overlapping bands of several images, so that wavelengths and bands correspond one to one.
In this embodiment, considering the influence of noise on hyperspectral data analysis, noise may be used as the criterion: optionally, the decision rule selects the image band with the least noise, and the fusion rule fuses several low-noise image bands into one single-band image. The band fusion process is described in detail later through specific embodiments.
With the hyperspectral image band fusion method above, the calibration pattern coordinates in each image are located, the maximum overlap region of each image is derived back from them, the maximum overlap region is cropped, the cropped images are resampled, and the resampled image bands are fused. Hyperspectral images covering several different wavelength ranges are thus automatically fused into a hyperspectral image whose wavelength range is the union of all the input wavelengths, expanding the wavelength range, greatly improving the quality of the final hyperspectral image, and representing more accurate and richer image information.
Referring to Figs. 3(a)-(c), which are schematic diagrams of the distances from the calibration rectangle to the image boundary in the reference image img_base, image img1 and image img2, respectively.
This embodiment builds on the embodiment of Fig. 1 and details step S3 further with a specific example.
In this example, the calibration pattern is a calibration rectangle; the hyperspectral image with the smallest resolution is taken as the reference image, denoted img_base, and the other images are denoted img1, img2 and so on.
(1) Calculate the scaling factors of each hyperspectral image.
The coordinates of the calibration rectangle in the reference image img_base, i.e. its upper-left and lower-right corner coordinates, are (x_b1, y_b1), (x_b2, y_b2);
the coordinates of the calibration rectangle in image img1 are (x1, y1), (x2, y2);
the coordinates of the calibration rectangle in image img2 are (x3, y3), (x4, y4).
The scaling factors between image img1 and the reference image img_base are calculated from the coordinates (x_b1, y_b1), (x_b2, y_b2), (x1, y1), (x2, y2):
factor1_w = (x2 - x1) / (x_b2 - x_b1);
factor1_h = (y2 - y1) / (y_b2 - y_b1);
and the scaling factors between image img2 and the reference image img_base from the coordinates (x_b1, y_b1), (x_b2, y_b2), (x3, y3), (x4, y4):
factor2_w = (x4 - x3) / (x_b2 - x_b1);
factor2_h = (y4 - y3) / (y_b2 - y_b1);
and so on, obtaining the scaling factors between each hyperspectral image and the reference image img_base.
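The scaling-factor formulas above can be sketched as a small helper (function name ours):

```python
def scale_factors(rect, rect_base):
    """Width/height scaling factors of one image relative to img_base,
    from the diagonal coordinates ((x1, y1), (x2, y2)) of the
    calibration rectangle in that image and in the reference image,
    mirroring factor_w and factor_h in the text."""
    (x1, y1), (x2, y2) = rect
    (xb1, yb1), (xb2, yb2) = rect_base
    return (x2 - x1) / (xb2 - xb1), (y2 - y1) / (yb2 - yb1)
```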
(2) Calculate the distance from the calibration rectangle to the boundary in each hyperspectral image.
As shown in Figs. 3(a)-(c), the distances from the calibration rectangle to the boundary in the reference image img_base are:
base_h=y_b1;
base_b=img_base_h-y_b2;
base_l=x_b1;
base_r=img_base_w-x_b2;
the distances from the calibration rectangle to the boundary in image img1 are:
h1=y1;
b1=img1_h-y2;
l1=x1;
r1=img1_w-x2;
the distances from the calibration rectangle to the boundary in image img2 are:
h2=y3;
b2=img2_h-y4;
l2=x3;
r2=img2_w-x4;
Above, h, b, l and r denote the distances from the calibration rectangle to the top, bottom, left and right boundaries, respectively.
The distances from the calibration rectangle to the boundary are computed likewise for each remaining hyperspectral image.
(3) Scale each hyperspectral image by its scaling factors and compute the scaled distances from the calibration rectangle to the four sides of the maximum overlap region in each hyperspectral image.
The scaled boundary distance formula for image img1 is:
h1_t=h1*factor1_h;
b1_t=b1*factor1_h;
l1_t=l1*factor1_w;
r1_t=r1*factor1_w;
the scaled boundary distance formula for image img2 is:
h2_t=h2*factor2_h;
b2_t=b2*factor2_h;
l2_t=l2*factor2_w;
r2_t=r2*factor2_w;
and so on, obtaining the scaled boundary distances of each hyperspectral image.
(4) For each side, select the smallest scaled boundary distance over all hyperspectral images as the boundary distance of the maximum overlap region.
The boundary distances of the maximum overlap region are calculated as:
h_res=min(base_h,h1_t,h2_t...);
b_res=min(base_b,b1_t,b2_t...);
l_res=min(base_l,l1_t,l2_t...);
r_res=min(base_r,r1_t,r2_t...);
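Step (4) is an element-wise minimum over the per-image boundary distance tuples, which can be sketched as (helper name ours):

```python
def overlap_boundary(all_dists):
    """Boundary distances (h_res, b_res, l_res, r_res) of the maximum
    overlap region: the element-wise minimum of the (h, b, l, r)
    tuples of the reference image and every scaled image."""
    return tuple(min(vals) for vals in zip(*all_dists))
```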
(5) Calculate the coordinates of the maximum overlap region in each hyperspectral image from its boundary distances.
The actual coordinates of the maximum overlap region in the reference image are (xb1, yb1), (xb2, yb2), calculated as follows:
xb1=x_b1-l_res;
yb1=y_b1-h_res;
xb2=x_b2+r_res;
yb2=y_b2+b_res;
When calculating the actual coordinates of the maximum overlap region in the other hyperspectral images, a transformation according to the corresponding scaling factors is needed:
the actual coordinates of the maximum overlap region in image img1 are (x11, y11), (x12, y12), calculated as:
x11=x1-l_res;
y11=y1-h_res;
x12=x2+r_res;
y12=y2+b_res;
the actual coordinates of the maximum overlap region in image img2 are (x21, y21), (x22, y22), calculated as:
x21=x3-l_res;
y21=y3-h_res;
x22=x4+r_res;
y22=y4+b_res;
and so on, obtaining the actual coordinates of the maximum overlap region in each hyperspectral image.
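Step (5) can be sketched as one helper implementing the coordinate formulas exactly as written above (name ours; note the text adds that for non-reference images the boundary distances may additionally need a scaling-factor transformation):

```python
def overlap_coords(rect, res):
    """Diagonal coordinates of the maximum overlap region in one image,
    from its calibration-rectangle coordinates rect = ((x1, y1), (x2, y2))
    and the overlap boundary distances res = (h_res, b_res, l_res, r_res).
    The overlap rectangle extends outward from the calibration rectangle
    by the four boundary distances."""
    (x1, y1), (x2, y2) = rect
    h_res, b_res, l_res, r_res = res
    return (x1 - l_res, y1 - h_res), (x2 + r_res, y2 + b_res)
```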
Referring to fig. 4, fig. 4 is a schematic flow chart of performing band fusion on each resampled hyperspectral image to obtain a fused hyperspectral image according to an embodiment of the present invention. The embodiment is further developed on the basis of any embodiment. As shown in fig. 4, performing band fusion on each resampled hyperspectral image to obtain a fused hyperspectral image, including:
S61: for each overlapping wavelength of the resampled hyperspectral images, calculating the single-band image noise of each corresponding hyperspectral image by using a semi-variance function;
S62: counting the number of bands classified as low noise according to the single-band image noise; the definition of low noise is relative, and the division can be made by presetting a noise-level judgment threshold.
S63: if the number n of low-noise bands is smaller than or equal to a threshold, for example n <= 1, taking the single-band image with the smallest noise as the single-band image for the overlapping wavelength;
S64: if the number n of low-noise bands is larger than the threshold, for example n > 1, weighting and fusing all the acquired low-noise single-band images into one single-band image, wherein each fusion weight is 1/n, and taking the weighted result as the single-band image for the overlapping wavelength;
S65: acquiring each hyperspectral image in which each wavelength uniquely corresponds to one band image; it will be appreciated that the above describes band fusion for one overlapping wavelength, and different overlapping wavelengths are processed independently.
S66: merging the obtained hyperspectral images to obtain the fused hyperspectral image.
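The decision logic of S62 to S64 may be sketched as follows; the noise scores, the low-noise threshold, and all names are illustrative assumptions, not part of the patent:

```python
import numpy as np

def fuse_overlapping_band(band_images, noises, noise_threshold):
    """band_images: 2-D arrays of identical shape (after resampling),
    one per hyperspectral image, all at the same overlapping wavelength.
    noises: one noise score per image; scores below noise_threshold
    count as low noise (S62)."""
    low = [img for img, nz in zip(band_images, noises) if nz < noise_threshold]
    n = len(low)
    if n <= 1:
        # S63: too few low-noise bands, keep the least-noisy single band
        return band_images[int(np.argmin(noises))]
    # S64: equal-weight (1/n) fusion of all low-noise band images
    return sum(img.astype(float) for img in low) / n
```

Each overlapping wavelength is processed independently with this routine, after which a single band image remains per wavelength (S65).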
The above embodiments will be described in detail below with reference to a specific example.
(1) For each overlapping wavelength, construct the pixel values of each corresponding hyperspectral image into a one-dimensional vector in row-major order. It will be appreciated that for one overlapping wavelength, one one-dimensional vector is constructed for each hyperspectral image. For example, if the overlapping wavelength is 900 nm and one hyperspectral image is 300 pixels wide and 500 pixels high, the corresponding one-dimensional vector has 300×500 = 150000 pixels.
(2) Combine every two pixels in each one-dimensional vector into a pixel pair, and calculate the distance between the pixels of a pair using the Manhattan distance. That is, each one-dimensional vector is processed separately to calculate the distances between pixel pairs.
(3) Calculate the squared difference between the two pixels of each pair as the difference of that pixel pair;
(4) Group the pixel pairs in each one-dimensional vector by distance, and calculate the average difference of each group from the pair differences.
(5) Determine the single-band image noise of the hyperspectral image at the current overlapping wavelength from the proportion, among all groups of the corresponding one-dimensional vector, of groups whose average difference exceeds a preset difference threshold.
Here, by setting a difference threshold, groups whose average difference exceeds it are regarded as noise; the ratio of the number of "noise" groups to the total number of groups (within the same one-dimensional vector) is then calculated, and the smaller the ratio, the smaller the noise of the image at this overlapping wavelength. Optionally, a ratio threshold may be preset: if the ratio is below it, the noise is considered small; otherwise, it is considered large.
(6) By the above method, the single-band image noise of each hyperspectral image is obtained.
It will be appreciated that the same wavelength may correspond to multiple bands. Here, for each overlapping wavelength, the noise of all corresponding band images in the respective hyperspectral images must be acquired. For example, if the overlapping wavelength 900 nm corresponds to a single-band image in each of 3 hyperspectral images, the single-band image noise of all 3 images is obtained by the above calculation.
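Steps (1) to (5) may be sketched as follows. Pairing every two pixels exhaustively would require O(n²) pairs, so this sketch groups pairs by their index distance (the Manhattan distance along the one-dimensional vector) up to a limited lag; this sampling strategy and all names are assumptions, not part of the patent:

```python
import numpy as np

def band_noise_ratio(band_img, max_lag, diff_threshold):
    """Semi-variance-style noise score for one single-band image at an
    overlapping wavelength. Returns the fraction of distance groups
    whose mean squared pixel-pair difference exceeds diff_threshold;
    a smaller ratio indicates less noise."""
    v = band_img.ravel().astype(float)  # (1) row-major one-dimensional vector
    noisy_groups = 0
    total_groups = 0
    for lag in range(1, max_lag + 1):       # (4) one group per pair distance
        if lag >= v.size:
            break
        diffs = (v[lag:] - v[:-lag]) ** 2   # (2)+(3) squared pair differences
        total_groups += 1
        if diffs.mean() > diff_threshold:   # (5) group counts as "noise"
            noisy_groups += 1
    return noisy_groups / total_groups
```

A preset ratio threshold can then classify the band as low or high noise, as in S62.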
Based on any of the above embodiments, another embodiment of the present invention further provides a hyperspectral image band fusion system, including a computer readable storage medium and a processor; the computer readable storage medium stores a computer program which, when executed by a processor, can implement the steps included in the hyperspectral image band fusion method according to any one of the embodiments. The specific steps are not repeated here, and reference is made to the description of the above embodiments for details.
Based on any of the above embodiments, another embodiment of the present invention further provides a computer readable storage medium, on which a computer program is stored, where the program, when executed by a processor, can implement the steps included in a hyperspectral image band fusion method according to any of the embodiments. The specific steps are not repeated here, and reference is made to the description of the above embodiments for details.
The hyperspectral image band fusion method, system and medium are suitable for portable hyperspectral cameras, fixed hyperspectral cameras or other hyperspectral cameras. After a plurality of hyperspectral images are collected by hyperspectral cameras with different wavelength ranges, a hyperspectral image whose wavelength range is the union of all the camera wavelength ranges can be fused automatically according to the embodiments of the invention, which greatly improves the quality of the hyperspectral image and yields more accurate and richer image information.
It will be appreciated by those skilled in the art that embodiments of the present invention may be provided as a method, system, or computer program product. Accordingly, the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present invention may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present invention is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the invention. It will be understood that each flow and/or block of the flowchart illustrations and/or block diagrams, and combinations of flows and/or blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
It should be noted that in the claims, any reference signs placed between parentheses shall not be construed as limiting the claim. The word "comprising" does not exclude the presence of elements or steps not listed in a claim. The word "a" or "an" preceding an element does not exclude the presence of a plurality of such elements. The invention may be implemented by means of hardware comprising several distinct elements, and by means of a suitably programmed computer. In the unit claims enumerating several means, several of these means may be embodied by one and the same item of hardware. The use of the words first, second, third, etc. do not denote any order. These words may be interpreted as names.
While preferred embodiments of the present invention have been described, additional variations and modifications in those embodiments may occur to those skilled in the art once they learn of the basic inventive concepts. It is therefore intended that the following claims be interpreted as including the preferred embodiments and all such alterations and modifications as fall within the scope of the invention.
It will be apparent to those skilled in the art that various modifications and variations can be made to the present invention without departing from the spirit or scope of the invention. Thus, it is intended that the present invention also include such modifications and alterations insofar as they come within the scope of the appended claims or the equivalents thereof.
In the description of the present invention, it should be understood that the terms "first," "second," and the like are used for descriptive purposes only and are not to be construed as indicating or implying a relative importance or number of technical features indicated. Thus, a feature defining "a first" or "a second" may explicitly or implicitly include one or more such feature. In the description of the present invention, the meaning of "a plurality" is two or more, unless explicitly defined otherwise.
In the present invention, unless explicitly specified and limited otherwise, the terms "mounted," "connected," "secured," and the like are to be construed broadly, and may be, for example, fixedly connected, detachably connected, or integrally formed; can be mechanically or electrically connected; can be directly connected or indirectly connected through an intermediate medium, and can be communicated with the inside of two elements or the interaction relationship of the two elements. The specific meaning of the above terms in the present invention can be understood by those of ordinary skill in the art according to the specific circumstances.
In the present invention, unless expressly stated or limited otherwise, a first feature "up" or "down" a second feature may be the first and second features in direct contact, or the first and second features in indirect contact via an intervening medium. Moreover, a first feature being "above," "over" and "on" a second feature may be a first feature being directly above or obliquely above the second feature, or simply indicating that the first feature is level higher than the second feature. The first feature being "under", "below" and "beneath" the second feature may be the first feature being directly under or obliquely below the second feature, or simply indicating that the first feature is less level than the second feature.
In the description of the present specification, a description referring to terms "one embodiment," "some embodiments," "examples," "specific examples," or "some examples," etc., means that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the present invention. In this specification, schematic representations of the above terms should not be understood as necessarily being directed to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples. Furthermore, the different embodiments or examples described in this specification and the features of the different embodiments or examples may be combined and combined by those skilled in the art without contradiction.
While embodiments of the present invention have been shown and described above, it will be understood that the above embodiments are illustrative and not to be construed as limiting the invention, and that variations, modifications, alternatives and variations may be made to the above embodiments by one of ordinary skill in the art within the scope of the invention.

Claims (8)

1. A method for band fusion of hyperspectral images, comprising:
acquiring a plurality of hyperspectral images corresponding to the same calibration graph, wherein the hyperspectral images respectively correspond to different wavelength ranges;
obtaining the calibration graph coordinates of each hyperspectral image;
according to the coordinates of the calibration graph, calculating and obtaining coordinates of the maximum overlapping areas of a plurality of hyperspectral images in each hyperspectral image;
cutting out the maximum overlapping area in each hyperspectral image according to the coordinates of the maximum overlapping area in each hyperspectral image;
resampling each cut hyperspectral image to obtain each resampled hyperspectral image;
performing band fusion on each resampled hyperspectral image to obtain fused hyperspectral images;
the calculating the coordinates of the maximum overlapping area of the hyperspectral images in each hyperspectral image according to the coordinates of the calibration graph comprises the following steps:
taking the hyperspectral image with the minimum resolution among the hyperspectral images as a reference image;
calculating the aspect ratio relation between the calibration graph in each hyperspectral image and the calibration graph in the reference image according to the calibration graph coordinates of each hyperspectral image, and obtaining the scaling factor of each hyperspectral image;
calculating the distance from the calibration graph to the boundary in each hyperspectral image according to the calibration graph coordinates of each hyperspectral image;
scaling each hyperspectral image according to the scaling factors, and calculating scaled boundary distances from the calibration graph in each hyperspectral image to the four sides corresponding to the maximum overlapping area according to the distances from the calibration graph to the boundary in each hyperspectral image and the scaling factor of each hyperspectral image;
respectively selecting, for each of the four corresponding sides, the minimum value among the scaled boundary distances, and defining the minimum values as the boundary distances of the maximum overlapping area;
and calculating coordinates of the maximum overlapping area corresponding to each hyperspectral image according to the boundary distance of the maximum overlapping area.
2. The method for band fusion of hyperspectral images as claimed in claim 1, wherein the obtaining of the calibration graph coordinates of each hyperspectral image comprises:
converting each hyperspectral image into a common image according to the pseudo-color wave band data corresponding to each hyperspectral image;
converting each common image into a gray scale image;
creating a mask according to the position of the calibration graph in each gray scale image;
performing mask calculation on each created mask and the corresponding gray scale image to obtain each mask image;
acquiring a binarization image corresponding to each mask image;
respectively obtaining the largest contour in each binarized image, and obtaining a corresponding minimum bounding rectangle according to each largest contour;
and defining the diagonal coordinates of the minimum bounding rectangle as the calibration graph coordinates of the corresponding hyperspectral image.
3. The method for band fusion of hyperspectral images as claimed in claim 1, wherein the step of band fusion of the resampled hyperspectral images to obtain the fused hyperspectral images comprises the steps of:
calculating, for each overlapping wavelength of the resampled hyperspectral images, the single-band image noise of each corresponding hyperspectral image by using a semi-variance function;
counting the number of bands classified as low noise according to the single-band image noise;
if the number of low-noise bands is smaller than or equal to a threshold, taking the single-band image with the smallest noise as the single-band image corresponding to the overlapping wavelength;
if the number of low-noise bands is larger than the threshold, weighting and fusing the low-noise single-band images into one single-band image, and taking the fused single-band image as the single-band image corresponding to the overlapping wavelength;
acquiring each hyperspectral image in which each wavelength uniquely corresponds to one band image;
and merging the obtained hyperspectral images to obtain the fused hyperspectral image.
4. The method for band fusion of hyperspectral images as claimed in claim 3, wherein the calculating of the single-band image noise of each hyperspectral image using the semi-variance function comprises:
respectively constructing the pixel values of each hyperspectral image corresponding to an overlapping wavelength into corresponding one-dimensional vectors in row-major order;
respectively combining every two pixels in each one-dimensional vector into a pixel pair, and calculating the distance between the pixels of each pair using the Manhattan distance;
calculating the square difference between two pixels in the pixel pair as the difference of the corresponding pixel pair;
grouping pixel pairs in each one-dimensional vector according to the distance, and calculating the average difference of each grouping according to the difference;
determining the single-band image noise of the hyperspectral image according to the proportion, among all groups of the corresponding one-dimensional vector, of groups whose average difference exceeds a difference threshold;
and acquiring single-band image noise of each hyperspectral image.
5. The hyperspectral image band fusion method as claimed in claim 1, wherein the hyperspectral image with the smallest resolution among the hyperspectral images is taken as the reference image, and each cut hyperspectral image is resampled with the resolution of the reference image as a reference.
6. The method for band fusion of hyperspectral images as claimed in claim 1, wherein the calibration graph is a rectangle.
7. A hyperspectral image band fusion system, comprising a computer readable storage medium and a processor; the computer readable storage medium has stored thereon a computer program which, when executed by a processor, is capable of carrying out the steps comprised in a hyperspectral image band fusion method as claimed in any one of the preceding claims 1 to 6.
8. A computer readable storage medium having stored thereon a computer program, which when executed by a processor is capable of performing the steps comprised in a hyperspectral image waveband fusion method as claimed in any one of the preceding claims 1 to 6.
CN202311615632.4A 2023-11-30 2023-11-30 Hyperspectral image wave band fusion method, hyperspectral image wave band fusion system and hyperspectral image wave band fusion medium Active CN117314813B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311615632.4A CN117314813B (en) 2023-11-30 2023-11-30 Hyperspectral image wave band fusion method, hyperspectral image wave band fusion system and hyperspectral image wave band fusion medium


Publications (2)

Publication Number Publication Date
CN117314813A CN117314813A (en) 2023-12-29
CN117314813B true CN117314813B (en) 2024-02-13

Family

ID=89285214

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311615632.4A Active CN117314813B (en) 2023-11-30 2023-11-30 Hyperspectral image wave band fusion method, hyperspectral image wave band fusion system and hyperspectral image wave band fusion medium

Country Status (1)

Country Link
CN (1) CN117314813B (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108230281A (en) * 2016-12-30 2018-06-29 北京市商汤科技开发有限公司 Remote sensing image processing method, device and electronic equipment
CN108364277A (en) * 2017-12-20 2018-08-03 南昌航空大学 A kind of infrared small target detection method of two-hand infrared image fusion
CN109886351A (en) * 2019-03-04 2019-06-14 北京麦飞科技有限公司 High-spectral data and high-definition picture fusion method
CN110706188A (en) * 2019-09-23 2020-01-17 北京航天宏图信息技术股份有限公司 Image fusion method and device, electronic equipment and storage medium
CN110869976A (en) * 2018-12-04 2020-03-06 深圳市大疆创新科技有限公司 Image processing method, device, unmanned aerial vehicle, system and storage medium
CN116579959A (en) * 2023-04-13 2023-08-11 北京邮电大学 Fusion imaging method and device for hyperspectral image

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8078009B2 (en) * 2008-07-08 2011-12-13 Harris Corporation Optical flow registration of panchromatic/multi-spectral image pairs
JP6746310B2 (en) * 2015-05-22 2020-08-26 キヤノン株式会社 Image processing device, imaging system, and image processing method
CN109472199B (en) * 2018-09-29 2022-02-22 深圳大学 Image fusion classification method and device


Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Image Fusion Using the Ehlers Spectral Characteristics Preservation Algorithm;Sascha Klonus 等;《GIScience & Remote Sensing》 *
Multi-band ship target recognition method based on feature-level fusion; Liu Feng et al.; Spectroscopy and Spectral Analysis *


Similar Documents

Publication Publication Date Title
KR101921672B1 (en) Image processing method and device
JP5432714B2 (en) Composition analysis method, image apparatus having composition analysis function, composition analysis program, and computer-readable recording medium
JP7002056B2 (en) 3D model generator and 3D model generation method
CN110009561B (en) Method and system for mapping surveillance video target to three-dimensional geographic scene model
CN107909640B (en) Face relighting method and device based on deep learning
CN111461319B (en) CNN-based object detection method and device capable of adapting to user requirements
JP4696856B2 (en) Image processing apparatus, image processing method, program thereof, and computer-readable recording medium recording the program
CN102970504A (en) Image projecting device, image processing device, image projecting method, and computer-readable recording medium
CN102881014A (en) Quick stereo matching method based on graph cut
CN106228579A (en) A kind of video image dynamic water table information extracting method based on geographical space-time scene
US20200388048A1 (en) Using spatial filter to reduce bundle adjustment block size
CN104036468A (en) Super-resolution reconstruction method for single-frame images on basis of pre-amplification non-negative neighbor embedding
CN113033553B (en) Multi-mode fusion fire detection method, device, related equipment and storage medium
CN110738731A (en) 3D reconstruction method and system for binocular vision
CN116778288A (en) Multi-mode fusion target detection system and method
CN117314813B (en) Hyperspectral image wave band fusion method, hyperspectral image wave band fusion system and hyperspectral image wave band fusion medium
Fryskowska et al. Some aspects of satellite imagery integration from Eros B and Landsat 8
US20210124975A1 (en) System using image connectivity to reduce bundle size for bundle adjustment
US11868377B2 (en) Systems and methods for providing geodata similarity
KR102411499B1 (en) Small river flow measurement device, system and method therefor using drone and small river automatic flow measurement technology
CN114972030A (en) Image splicing method and device, storage medium and electronic equipment
JP2020140497A (en) Calculation device, and parallax calculation method
CN114565764A (en) Port panorama sensing system based on ship instance segmentation
CN113378818A (en) Electrical equipment defect determining method and device, electronic equipment and storage medium
KR20220144237A (en) Real-time Rainfall Prediction Device using Cloud Images, and Rainfall Prediction Method using the same, and a computer-readable storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant