CN108681710B - Ship identification method and device under sea-sky background based on broadband-hyperspectral infrared image fusion method


Publication number
CN108681710B
Authority
CN
China
Prior art keywords
image
brightness value
evidence
fusion
infrared
Prior art date
Legal status
Active
Application number
CN201810471040.2A
Other languages
Chinese (zh)
Other versions
CN108681710A (en)
Inventor
高昆
赵天择
王静
华梓铮
王广平
周颖婕
吴穹
Current Assignee
Beijing Institute of Technology BIT
Beijing Institute of Environmental Features
Original Assignee
Beijing Institute of Technology BIT
Beijing Institute of Environmental Features
Priority date
Filing date
Publication date
Application filed by Beijing Institute of Technology BIT, Beijing Institute of Environmental Features filed Critical Beijing Institute of Technology BIT
Priority to CN201810471040.2A
Publication of CN108681710A
Application granted
Publication of CN108681710B

Classifications

    • G06V 20/13: Satellite images (Scenes; Terrestrial scenes)
    • G06T 5/40: Image enhancement or restoration by the use of histogram techniques
    • G06T 5/50: Image enhancement or restoration by the use of more than one image, e.g. averaging, subtraction
    • G06T 2207/10048: Infrared image (image acquisition modality)
    • G06T 2207/20221: Image fusion; Image merging (image combination)

Abstract

The invention provides a ship identification method and device under a sea-sky background based on a broadband-hyperspectral infrared image fusion method, and relates to the technical field of object identification. The method comprises the following steps: acquiring a first image and a second image, wherein the first image and the second image are a hyperspectral infrared image and a broadband infrared image of a scene to be identified after spatial registration; extracting spectral features of the hyperspectral infrared image from the first image, and extracting infrared features of the broadband infrared image from the second image; and fusing the spectral features and the infrared features based on a D-S evidence reasoning method to obtain a first recognition result of the scene to be recognized. The invention alleviates the technical problem that the existing ship detection technology cannot quickly obtain a high-precision identification result.

Description

Ship identification method and device under sea-sky background based on broadband-hyperspectral infrared image fusion method
Technical Field
The invention relates to the technical field of object identification, in particular to a ship identification method and a ship identification device under a sea-sky background based on a broadband-hyperspectral infrared image fusion method.
Background
Object identification is of great significance in fields such as security and exploration. For example, identification of sea-surface ship targets is an important link in national ocean monitoring, and is of great importance for economic development, environmental protection, the safeguarding of maritime rights and interests, and the development of military strength.
Offshore ship detection and dynamic monitoring applications place ever higher demands on the accuracy of the related monitoring and control systems. Among existing ship detection technologies, high-resolution geosynchronous-orbit remote sensing is still under development, and research on its application to the monitoring of offshore ships has not yet gone deep. As for the detection method itself, infrared target detection mainly follows the idea of combining single-frame detection with multi-frame confirmation; the detection and comparison process is time-consuming, so a high-precision identification result cannot be obtained quickly.
For the technical problem that existing ship detection technology cannot quickly obtain a high-precision identification result, no effective solution is currently available.
Disclosure of Invention
In view of the above, the present invention provides a ship identification method and apparatus based on a broadband-hyperspectral infrared image fusion method in a sea-sky background, so as to alleviate the technical problem that the existing ship detection technology cannot obtain a high-precision identification result quickly.
In a first aspect, an embodiment of the present invention provides a method for identifying a ship in a sea-sky background based on a broadband-hyperspectral infrared image fusion method, including:
acquiring a first image and a second image, wherein the first image and the second image are a hyperspectral infrared image and a broadband infrared image of a scene to be identified after spatial registration;
extracting spectral features of the hyperspectral infrared image from the first image, and extracting infrared features of the broadband infrared image from the second image;
and fusing the spectral features and the infrared features based on a D-S evidence reasoning method to obtain a first recognition result of the scene to be recognized.
With reference to the first aspect, an embodiment of the present invention provides a first possible implementation manner of the first aspect, where fusing the spectral feature and the infrared feature based on a D-S evidence reasoning method to obtain a first recognition result of the scene to be recognized, including:
obtaining basic probability assignments of the spectral features corresponding to the target identification objects by using a gray theory as a first evidence, and obtaining the basic probability assignments of the infrared features corresponding to the target identification objects by using the gray theory as a second evidence;
calculating a basic probability assignment function and a confidence interval of each evidence after the first evidence and the second evidence are subjected to fusion processing by using a Dempster combination rule;
and selecting the processed evidence with the maximum support degree from the processed evidences according to the basic probability assignment function and the confidence degree interval of the processed evidences, and determining the processed evidence with the maximum support degree as a first recognition result of the scene to be recognized.
With reference to the first aspect, an embodiment of the present invention provides a second possible implementation manner of the first aspect, where the object identification method further includes:
fusing the first image and the second image to obtain a fusion result;
identifying the scene to be identified according to the fusion result to obtain a second identification result;
and combining the first recognition result and the second recognition result to recognize and confirm the scene to be recognized.
With reference to the second possible implementation manner of the first aspect, an embodiment of the present invention provides a third possible implementation manner of the first aspect, where fusing the first image and the second image to obtain a fusion result includes:
performing histogram equalization stretching processing on the first image to convert the brightness value of the first image into an equalized brightness value to obtain a first equalized brightness value;
performing histogram equalization stretching processing on the second image to convert the brightness value of the second image into an equalized brightness value to obtain a second equalized brightness value;
and fusing the first equalized brightness value and the second equalized brightness value to obtain the fusion result.
With reference to the second possible implementation manner of the first aspect, an embodiment of the present invention provides a fourth possible implementation manner of the first aspect, where fusing the first equalized luminance value and the second equalized luminance value to obtain the fusion result includes:
weighting and summing the first equalized brightness value and the second equalized brightness value to obtain a fused brightness value matrix, and determining an image represented by the fused brightness value matrix as a fused image;
and performing histogram restoration on the fused image to obtain the fusion result.
In a second aspect, an embodiment of the present invention further provides a ship identification apparatus under a sea-sky background based on a broadband-hyperspectral infrared image fusion method, including:
an acquisition module, which is used for acquiring a first image and a second image, wherein the first image and the second image are a hyperspectral infrared image and a broadband infrared image of a scene to be identified after spatial registration;
the extraction module is used for extracting the spectral characteristics of the hyperspectral infrared image from the first image and extracting the infrared characteristics of the broadband infrared image from the second image;
and the first fusion module is used for fusing the spectral features and the infrared features based on a D-S evidence reasoning method to obtain a first recognition result of the scene to be recognized.
The embodiment of the invention has the following beneficial effects:
the image information can reflect the external quality characteristics of the sample such as size, shape, defects and the like; due to the fact that different components have different spectral absorption, spectral information can reflect differences of physical structures and chemical components in samples, and the hyperspectral infrared image integrates image information and spectral information of the samples, so that the hyperspectral infrared image has unique advantages in the aspects of quality detection, target identification and tracking. However, due to the influence of the extinction of the optical filter, the spatial resolution and the image signal-to-noise ratio of the hyperspectral infrared image are often low. However, images with high spatial resolution and high signal-to-noise ratio can be acquired using a broadband infrared detector.
The ship identification method under the sea and sky background based on the broadband-hyperspectral infrared image fusion method provided by the invention comprises the following steps: acquiring a first image and a second image, wherein the first image and the second image are a hyperspectral infrared image and a broadband infrared image of a scene to be identified after spatial registration; extracting spectral features of a hyperspectral infrared image from the first image, and extracting infrared features of a broadband infrared image from the second image; and fusing the spectral features and the infrared features based on a D-S evidence reasoning method to obtain a first recognition result of the scene to be recognized. Therefore, the ship identification method under the sea and sky background based on the broadband-hyperspectral infrared image fusion method improves the limitation of single source data, improves the signal-to-noise ratio and the spatial resolution of the hyperspectral infrared image, and enriches the spectral characteristics of the broadband infrared image, so that the first identification result of the scene to be identified is obtained conveniently and accurately, and the technical problem that the existing ship detection technology cannot obtain a high-precision identification result quickly is solved.
Additional features and advantages of the invention will be set forth in the description which follows, and in part will be obvious from the description, or may be learned by practice of the invention. The objectives and other advantages of the invention will be realized and attained by the structure particularly pointed out in the written description and drawings.
In order to make the aforementioned and other objects, features and advantages of the present invention comprehensible, preferred embodiments accompanied with figures are described in detail below.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly described below, and it is obvious that the drawings in the following description are some embodiments of the present invention, and other drawings can be obtained by those skilled in the art without creative efforts.
Fig. 1 is a flowchart of a ship identification method in a sea-sky background based on a broadband-hyperspectral infrared image fusion method according to an embodiment of the present invention;
fig. 2 is a flowchart of a method for obtaining a first recognition result of a scene to be recognized by fusing a spectral feature and an infrared feature based on a D-S evidence reasoning method according to an embodiment of the present invention;
fig. 3 is a flowchart of another ship identification method based on a broadband-hyperspectral infrared image fusion method in a sea-sky background according to an embodiment of the present invention;
fig. 4 is a structural block diagram of a ship identification apparatus in a sea-sky background based on a broadband-hyperspectral infrared image fusion method according to a second embodiment of the present invention;
fig. 5 is a block diagram of another ship identification apparatus under the sea-sky background based on a broadband-hyperspectral infrared image fusion method according to the second embodiment of the present invention.
Icon: 100-an acquisition module; 200-an extraction module; 300-a first fusion module; 400-a second fusion module; 500-an identification module; 600-confirmation module.
Detailed Description
To make the objects, technical solutions and advantages of the embodiments of the present invention clearer, the technical solutions of the present invention will be clearly and completely described below with reference to the accompanying drawings, and it is apparent that the described embodiments are some, but not all embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
The existing high-resolution geosynchronous-orbit remote sensing technology for ship detection is still in the development stage, and the detection and comparison process in the idea of combining single-frame detection with multi-frame confirmation is time-consuming; the prior art therefore lacks a ship detection technology that can quickly obtain a high-precision identification result. For this reason, the object identification method and apparatus provided by the embodiments of the present invention can alleviate the technical problem that the existing ship detection technology cannot quickly obtain a high-precision identification result.
Example one
The ship identification method under the sea-sky background based on the broadband-hyperspectral infrared image fusion method in the embodiment of the invention is shown in fig. 1 and comprises the following steps:
step S102, acquiring a first image and a second image, wherein the first image and the second image are a hyperspectral infrared image and a broadband infrared image of a scene to be identified after spatial registration;
step S104, extracting the spectral characteristics of the hyperspectral infrared image from the first image, and extracting the infrared characteristics of the broadband infrared image from the second image;
and S106, fusing the spectral features and the infrared features based on a D-S evidence reasoning method to obtain a first recognition result of the scene to be recognized.
It should be noted that the first recognition result is a recognition result of a scene to be recognized, and the scene to be recognized is a scene including a ship in a sea-sky background.
In the embodiment of the invention, the ship identification method under the sea-sky background based on the broadband-hyperspectral infrared image fusion method obtains the first identification result of the scene to be identified from the spatially registered hyperspectral infrared image and broadband infrared image of that scene. This overcomes the limitation of single-source data, improves the signal-to-noise ratio and spatial resolution available from the hyperspectral infrared image, and enriches the spectral characteristics of the broadband infrared image, so that the first identification result of the scene to be identified is obtained conveniently and accurately, which alleviates the technical problem that the existing ship detection technology cannot rapidly obtain a high-precision identification result.
In an optional implementation manner of the embodiment of the present invention, as shown in fig. 2, in step S106, the fusion of the spectral feature and the infrared feature is performed based on a D-S evidence reasoning method to obtain a first recognition result of a scene to be recognized, where the method includes:
step S201, obtaining the basic probability assignment of the spectral features corresponding to the target identification objects by using a gray theory as a first evidence, and obtaining the basic probability assignment of the target identification objects of the infrared features by using the gray theory as a second evidence.
Specifically, a grey theory is used for obtaining basic probability assignment of scene feature vectors (feature vectors corresponding to spectral features/feature vectors corresponding to infrared features) to be recognized to each target recognition object, and the method comprises the following steps:
(1) Determine X0 = {X0(k) | k = 1, 2, ..., n}, the reference sequence, i.e. the feature vector of the scene to be identified, and Xi = {Xi(k) | k = 1, 2, ..., n} (i = 1, 2, ..., m), the comparison sequences, i.e. the feature vectors of the target identification objects in the database.
(2) Because the feature components in a feature vector may express different physical meanings, the feature vector of each target recognition object and the feature vector of the scene to be identified need to be processed first. The dimension of the original data can be eliminated by an initial-value transformation, i.e. each sequence is divided by its first component:

X0'(k) = X0(k) / X0(1), Xi'(k) = Xi(k) / Xi(1), k = 1, 2, ..., n
(3) Let Δi(k) = |X0(k) − Xi(k)|, the absolute difference of the k-th components of X0 and Xi. The correlation coefficient of X0(k) and Xi(k) can then be expressed as

ξi(k) = (min_i min_k Δi(k) + ρ · max_i max_k Δi(k)) / (Δi(k) + ρ · max_i max_k Δi(k))

where min_i min_k Δi(k) is called the two-level minimum difference, max_i max_k Δi(k) is called the two-level maximum difference, and ρ ∈ (0, ∞) is called the resolution coefficient. The smaller ρ is, the greater the resolving power. ρ is generally taken in the range [0, 1], most commonly ρ = 0.5. It should be noted that the correlation coefficient reflects the degree of matching between the k-th feature attribute of the feature vector of the scene to be identified and the k-th feature attribute of the feature vector of the i-th target identification object.
(4) From this, the correlation coefficients of Xi with respect to X0 are obtained, i.e. ξi = {ξi(k) | k = 1, 2, ..., n}.
(5) The set of correlation coefficients is condensed into a single value, called the gray relational degree (gray correlation) and denoted γ(X0, Xi). It is defined as

γi = γ(X0, Xi) = Σ_{k=1..n} a(k) · ξi(k)

where a(k), k = 1, 2, ..., n, are the weights of the individual correlation coefficients and satisfy Σ_{k=1..n} a(k) = 1.
(6) Finally, the basic probability assignments m(Ri) and m(U) are defined from the gray relational degrees γi (typically by normalizing the γi over all target recognition objects), where Ri represents each target recognition object in the database and U represents the uncertain (indeterminate) proposition.
Thus, when the feature vector of the scene to be identified is the feature vector corresponding to the spectral features, Ri represents the spectral features corresponding to each target recognition object, and m(Ri) and m(U) constitute the first evidence; when the feature vector of the scene to be identified is the feature vector corresponding to the infrared features, Ri represents the infrared features corresponding to each target recognition object, and m(Ri) and m(U) constitute the second evidence.
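For illustration only, the following Python sketch implements the gray-relational basic probability assignment described above under two stated assumptions that are not taken from the patent: the weights a(k) are set equal (a(k) = 1/n), and the mass of the uncertain proposition U is set to 1 − max γi before normalization. The function name gray_bpa and its interface are likewise hypothetical.

```python
import numpy as np

def gray_bpa(x0, references, rho=0.5):
    """Gray-relational BPA sketch.

    x0: feature vector of the scene to be identified, shape (n,).
    references: feature vectors of the m target identification objects, shape (m, n).
    Returns a dict mapping 'R1'..'Rm' and 'U' to basic probability masses.
    """
    x0 = np.asarray(x0, dtype=float)
    refs = np.asarray(references, dtype=float)

    # (2) initial-value transformation to remove the dimension of the raw data
    x0n = x0 / x0[0]
    refsn = refs / refs[:, :1]

    # (3) absolute differences and the two-level minimum / maximum differences
    delta = np.abs(x0n - refsn)                 # Delta_i(k)
    d_min, d_max = delta.min(), delta.max()

    # gray relational coefficients xi_i(k)
    xi = (d_min + rho * d_max) / (delta + rho * d_max)

    # (5) gray relational degrees gamma_i with equal weights a(k) = 1/n (assumption)
    gamma = xi.mean(axis=1)

    # (6) basic probability assignment; m(U) is taken as 1 - max gamma_i before
    # normalization, an assumed (not patented) choice
    u = 1.0 - gamma.max()
    total = gamma.sum() + u
    m = {f"R{i + 1}": g / total for i, g in enumerate(gamma)}
    m["U"] = u / total
    return m
```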
And step S202, calculating a basic probability assignment function and a confidence interval of each evidence after the first evidence and the second evidence are subjected to fusion processing by using a Dempster combination rule.
Specifically, the steps are as follows:
Let Bel1 and Bel2 be two trust functions on the same recognition framework Θ, let m1 and m2 be the corresponding basic probability assignments (i.e. the first evidence and the second evidence), and let their focal elements be A1, A2, ..., Ak and B1, B2, ..., Br respectively, with the conflict factor

K = Σ_{Ai ∩ Bj = ∅} m1(Ai) · m2(Bj)

The basic probability assignment function of the evidence C obtained by fusing the first evidence and the second evidence is then

m(C) = ( Σ_{Ai ∩ Bj = C} m1(Ai) · m2(Bj) ) / (1 − K) for C ≠ ∅, and m(∅) = 0

It should be emphasized that, in the above formula, if K ≠ 1, m(C) is accepted as a basic probability assignment; if K = 1, m1 and m2 are considered contradictory and the basic probability assignments cannot be combined.
It should be noted that:
(1) The basic probability assignment m(A) of an event A represents the degree of support for proposition A, where A is called a focal element;
(2) Trust function:

Bel(A) = Σ_{B ⊆ A} m(B)

It characterizes the degree of trust in proposition A; the trust function can be understood as the total support of the evidence for proposition A, or the degree to which a decision-maker reasonably believes proposition A given the evidence.
(3) Plausibility function:

Pl(A) = Σ_{B ∩ A ≠ ∅} m(B) = 1 − Bel(¬A)

It characterizes the degree of confidence that A is not negated, and can also be understood as the degree of support that the evidence could give to proposition A.
Therefore, the trust function and the plausibility function together summarize the relation of the evidence to a specific proposition A; the interval [Bel(A), Pl(A)] is the confidence interval of the focal element A, and this interval distinguishes the unknown from the uncertain well.
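As a minimal sketch of step S202, the following Python code combines two such evidences with Dempster's rule and evaluates the trust and plausibility functions, assuming (as in the sketch above) that each evidence is a dict over singleton hypotheses 'R1', ..., 'Rm' plus the uncertain proposition 'U', which stands for the whole recognition framework Θ. The function names are illustrative.

```python
def dempster_combine(m1, m2):
    """Combine two BPAs over singleton hypotheses plus 'U' (the whole frame)."""
    combined = {}
    conflict = 0.0  # K: total mass assigned to incompatible focal-element pairs
    for a, ma in m1.items():
        for b, mb in m2.items():
            if a == "U":
                focal = b
            elif b == "U" or a == b:
                focal = a
            else:
                conflict += ma * mb  # distinct singletons have empty intersection
                continue
            combined[focal] = combined.get(focal, 0.0) + ma * mb
    if conflict >= 1.0:
        raise ValueError("K = 1: the two evidences are contradictory")
    return {c: v / (1.0 - conflict) for c, v in combined.items()}

def belief(m, a):
    # Bel(A): for a singleton A, only A itself is a subset of A
    return m.get(a, 0.0)

def plausibility(m, a):
    # Pl(A): masses of focal elements intersecting A, i.e. m(A) + m(U) for singletons
    return m.get(a, 0.0) + m.get("U", 0.0)
```

For example, fused = dempster_combine(evidence1, evidence2) gives the combined masses, whose intervals [belief(fused, 'Ri'), plausibility(fused, 'Ri')] are then used in the decision of step S203.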
And step S203, selecting the processed evidence with the maximum support degree from the processed evidences according to the basic probability assignment function and the confidence degree interval of the processed evidences, and determining the processed evidence with the maximum support degree as a first recognition result of the scene to be recognized.
Optionally, the processed evidence with the largest degree of support may be selected from the processed evidences according to the basic probability assignment function of the processed evidences, and this evidence may be determined as the first recognition result of the scene to be recognized. Let Ci denote the processed evidences; the decision proceeds as follows:

m(C1) = max{ m(Ci) | Ci ⊆ Θ }
m(C2) = max{ m(Ci) | Ci ⊆ Θ, Ci ≠ C1 }

If

m(C1) − m(C2) > ε1
m(U) < ε2
m(C1) > m(U)

then C1 is the decision result, where ε1 and ε2 are preset thresholds. This method is simple and, when the thresholds are chosen appropriately, gives good decision results. In this way, the method fuses the features extracted from a single broadband infrared image (which can be acquired by a broadband infrared sensor) and a single hyperspectral infrared image (acquired by a hyperspectral infrared camera), eliminates possible redundancy and contradiction between the pieces of information, expands the complementarity between them, improves the reliability of single-source target identification, and effectively distinguishes true from false targets.
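A minimal sketch of this threshold decision rule (step S203) is given below; the concrete threshold values eps1 and eps2 are purely illustrative, since the patent only states that ε1 and ε2 are preset thresholds.

```python
def decide(m, eps1=0.1, eps2=0.2):
    """Return the hypothesis C1 if it passes the three threshold tests, else None."""
    ranked = sorted(((v, c) for c, v in m.items() if c != "U"), reverse=True)
    (m_c1, c1), (m_c2, _) = ranked[0], ranked[1]
    m_u = m.get("U", 0.0)
    if m_c1 - m_c2 > eps1 and m_u < eps2 and m_c1 > m_u:
        return c1  # first recognition result of the scene to be recognized
    return None    # no confident decision under these thresholds
```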
In another optional implementation manner of the embodiment of the present invention, as shown in fig. 3, the method for identifying a ship in a sea-sky background based on the broadband-hyperspectral infrared image fusion method further includes:
s107, fusing the first image and the second image to obtain a fusion result;
step S108, identifying the scene to be identified according to the fusion result to obtain a second identification result;
and step S109, combining the first recognition result and the second recognition result to recognize and confirm the scene to be recognized.
In the embodiment of the invention, the second identification result is obtained by the data level fusion of the first image and the second image, and the scene to be identified is identified and confirmed by combining the first identification result and the second identification result, so that the identification precision of the scene to be identified is improved.
In another optional implementation manner of the embodiment of the present invention, in step S107, fusing the first image and the second image to obtain a fusion result, including:
performing histogram equalization stretching processing on the first image to convert the brightness value of the first image into an equalized brightness value to obtain a first equalized brightness value;
performing histogram equalization stretching processing on the second image to convert the brightness value of the second image into an equalized brightness value to obtain a second equalized brightness value;
and fusing the first equalized brightness value and the second equalized brightness value to obtain a fusion result.
Specifically, the calculation process of performing histogram equalization stretching on an image is as follows:
(1) Calculate the probability of the k-th level brightness value of the original image (the first image or the second image):

Pr(rk) = nk / n, k = 0, 1, ..., L − 1

where L denotes the number of brightness levels of the original image, nk denotes the number of pixels in which the k-th level brightness appears, n denotes the total number of pixels of the original image, rk denotes the brightness value of the k-th level, and Pr(rk) denotes the probability of the k-th level brightness value.
(2) The transformation function for converting the brightness values of the original image into equalized brightness values is as follows:

sk = T(rk) = Σ_{j=0..k} Pr(rj) = Σ_{j=0..k} nj / n

According to the transformation function T(r), the brightness value r of the original image is converted into the equalized brightness value s.
The embodiment of the invention realizes the data level fusion of the first image and the second image through the fusion of the equalized brightness values.
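A minimal Python sketch of the histogram-equalization stretch described above, assuming 8-bit input images with L = 256 brightness levels and scaling the equalized value s = T(r) ∈ [0, 1] back to the 0..L−1 range (a common convention not spelled out in the patent):

```python
import numpy as np

def histogram_equalize(image, levels=256):
    """Histogram-equalization stretch for an integer-valued (e.g. uint8) image."""
    img = np.asarray(image)
    hist = np.bincount(img.ravel(), minlength=levels)   # n_k for each brightness level
    p = hist / img.size                                  # P_r(r_k) = n_k / n
    cdf = np.cumsum(p)                                   # T(r_k) = sum_{j<=k} P_r(r_j)
    return np.round(cdf[img] * (levels - 1)).astype(img.dtype)
```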
In another optional implementation manner of the embodiment of the present invention, the fusing the first equalized luminance value and the second equalized luminance value to obtain a fused result includes:
carrying out weighted summation on the first equalized brightness value and the second equalized brightness value to obtain a fused brightness value matrix, and determining an image represented by the fused brightness value matrix as a fused image;
and performing histogram restoration on the fused image to obtain a fusion result.
Specifically, the first equalized luminance value and the second equalized luminance value are subjected to weighted summation to obtain a fused luminance value matrix, and the following formula can be adopted:
Rijk = SHE(Mijk) + Kk · FM(SHE(Hij))
where Rijk represents the brightness value at point (i, j) of the k-th band of the fused image, SHE represents the histogram equalization stretching process, Mijk represents the k-th band data of the hyperspectral infrared image, Hij represents the broadband infrared image data, Kk represents the adjustment data for the k-th band image, SHE(Mijk) represents the first equalized brightness value, SHE(Hij) represents the second equalized brightness value, and FM represents image median filtering.
It should be noted that the participation of FM mainly reduces the influence of geometric mismatching between the hyperspectral infrared image data and the broadband infrared image data. If the registration between the two is very accurate, the FM operation can be omitted; otherwise, the threshold value of the median filtering can be expanded, namely, the matching between the hyperspectral infrared image data and the broadband infrared image data is adjusted by expanding the filtering window.
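For illustration, the following Python sketch applies the weighted fusion formula, with SciPy's median filter standing in for the FM operation; the per-band adjustment data Kk and the filter window size are treated as given inputs rather than values prescribed by the patent. Enlarging fm_window corresponds to expanding the filtering window when the registration between the two data sources is less accurate, while a window of 0 or 1 skips the FM step entirely.

```python
import numpy as np
from scipy.ndimage import median_filter

def fuse_luminance(hyper_eq, broad_eq, k_weights, fm_window=3):
    """Weighted fusion R_ijk = SHE(M_ijk) + K_k * FM(SHE(H_ij)).

    hyper_eq: equalized hyperspectral cube SHE(M), shape (H, W, bands).
    broad_eq: equalized broadband image SHE(H), shape (H, W).
    k_weights: per-band adjustment data K_k, shape (bands,).
    fm_window: median-filter window size; use 0 or 1 to skip the FM step
               (e.g. when the registration is already very accurate).
    """
    broad = broad_eq.astype(float)
    if fm_window and fm_window > 1:
        broad = median_filter(broad, size=fm_window)   # FM: image median filtering
    k = np.asarray(k_weights, dtype=float)
    return hyper_eq.astype(float) + k[None, None, :] * broad[:, :, None]
```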
The purpose of restoring the histogram of the fused image is to recover the original multispectral characteristics of the fused image and improve recognition accuracy. In detail, this can be realized by a histogram specification process: a new cumulative histogram function is obtained from the histogram transformation function of each band of the original image, and histogram specification of the fused image is then performed on the basis of this new cumulative histogram function.
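As one possible realization of this step (a sketch under the assumption that standard per-band histogram matching is acceptable, rather than the patent's exact specification procedure), each band of the fused image can be mapped so that its cumulative histogram follows that of the corresponding band of the original hyperspectral image:

```python
import numpy as np

def histogram_match(fused_band, reference_band, levels=256):
    """Map fused_band so its cumulative histogram follows that of reference_band."""
    f = np.clip(np.round(fused_band), 0, levels - 1).astype(int)
    r = np.clip(np.round(reference_band), 0, levels - 1).astype(int)
    cdf_f = np.cumsum(np.bincount(f.ravel(), minlength=levels)) / f.size
    cdf_r = np.cumsum(np.bincount(r.ravel(), minlength=levels)) / r.size
    # for each fused brightness level, pick the reference level with the closest CDF
    mapping = np.clip(np.searchsorted(cdf_r, cdf_f), 0, levels - 1)
    return mapping[f]
```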
Example two
The ship identification device under the sea and sky background based on the broadband-hyperspectral infrared image fusion method provided by the embodiment of the invention, as shown in fig. 4, comprises:
the system comprises an acquisition module 100, a processing module and a processing module, wherein the acquisition module is used for acquiring a first image and a second image, and the first image and the second image are a hyperspectral infrared image and a broadband infrared image of a scene to be identified after spatial registration;
an extraction module 200, configured to extract a spectral feature of a hyperspectral infrared image from a first image, and extract an infrared feature of a broadband infrared image from a second image;
the first fusion module 300 is configured to fuse the spectral feature and the infrared feature based on a D-S evidence reasoning method to obtain a first recognition result of the scene to be recognized.
In the embodiment of the invention, the ship identification apparatus under the sea-sky background based on the broadband-hyperspectral infrared image fusion method overcomes the limitation of single-source data, improves the signal-to-noise ratio and spatial resolution available from the hyperspectral infrared image, and enriches the spectral characteristics of the broadband infrared image, so that the first identification result of the scene to be identified can be obtained conveniently and accurately. Applying this object identification method to the field of ship detection alleviates the technical problem that the existing ship detection technology cannot quickly obtain a high-precision identification result.
In an optional implementation manner of the embodiment of the present invention, the first fusion module is configured to:
obtaining basic probability assignments of the spectral features corresponding to the target identification objects by using gray theory, as a first evidence, and obtaining basic probability assignments of the infrared features corresponding to the target identification objects by using gray theory, as a second evidence;
calculating a basic probability assignment function and a confidence interval of each evidence after fusion processing of the first evidence and the second evidence by using a Dempster combination rule;
and selecting the processed evidence with the maximum support degree from the processed evidences according to the basic probability assignment function and the confidence degree interval of the processed evidences, and determining the processed evidence with the maximum support degree as a first recognition result of the scene to be recognized.
In another optional implementation manner of the embodiment of the present invention, as shown in fig. 5, the ship identification apparatus under the sea-sky background based on the broadband-hyperspectral infrared image fusion method further includes:
a second fusion module 400, configured to fuse the first image and the second image to obtain a fusion result;
the recognition module 500 is configured to recognize the scene to be recognized according to the fusion result to obtain a second recognition result;
and the confirming module 600 is configured to perform identification confirmation on the scene to be identified by combining the first identification result and the second identification result.
In another optional implementation manner of the embodiment of the present invention, the second fusion module includes:
the first processing unit is used for performing histogram equalization stretching processing on the first image so as to convert the brightness value of the first image into an equalized brightness value and obtain a first equalized brightness value;
the second processing unit is used for performing histogram equalization stretching processing on the second image so as to convert the brightness value of the second image into an equalized brightness value and obtain a second equalized brightness value;
and the fusion unit is used for fusing the first equalized brightness value and the second equalized brightness value to obtain a fusion result.
In another optional implementation manner of the embodiment of the present invention, the fusion unit is configured to:
carrying out weighted summation on the first equalized brightness value and the second equalized brightness value to obtain a fused brightness value matrix, and determining an image represented by the fused brightness value matrix as a fused image;
and performing histogram restoration on the fused image to obtain a fusion result.
The computer program product of the ship identification method and apparatus under the sea-sky background based on the broadband-hyperspectral infrared image fusion method provided by the embodiments of the present invention includes a computer-readable storage medium storing program code; the instructions included in the program code can be used to execute the method described in the foregoing method embodiment, and for specific implementation reference may be made to the method embodiment, which is not described again here.
It is clear to those skilled in the art that, for convenience and brevity of description, the specific working processes of the system and the apparatus described above may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In addition, in the description of the embodiments of the present invention, unless otherwise explicitly specified or limited, the terms "mounted," "connected," and "connected" are to be construed broadly, e.g., as meaning either a fixed connection, a removable connection, or an integral connection; can be mechanically or electrically connected; they may be connected directly or indirectly through intervening media, or they may be interconnected between two elements. The specific meanings of the above terms in the present invention can be understood in specific cases to those skilled in the art.
The functions, if implemented in the form of software functional units and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present invention may be embodied in the form of a software product, which is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present invention. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and other various media capable of storing program codes.
In the description of the present invention, it should be noted that the terms "center", "upper", "lower", "left", "right", "vertical", "horizontal", "inner", "outer", etc., indicate orientations or positional relationships based on the orientations or positional relationships shown in the drawings, and are only for convenience of description and simplicity of description, but do not indicate or imply that the device or element being referred to must have a particular orientation, be constructed and operated in a particular orientation, and thus, should not be construed as limiting the present invention.
Furthermore, the terms "first," "second," and "third" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance.
Finally, it should be noted that: the above-mentioned embodiments are only specific embodiments of the present invention, which are used for illustrating the technical solutions of the present invention and not for limiting the same, and the protection scope of the present invention is not limited thereto, although the present invention is described in detail with reference to the foregoing embodiments, those skilled in the art should understand that: any person skilled in the art can modify or easily conceive the technical solutions described in the foregoing embodiments or equivalent substitutes for some technical features within the technical scope of the present disclosure; such modifications, changes or substitutions do not depart from the spirit and scope of the embodiments of the present invention, and they should be construed as being included therein. Therefore, the protection scope of the present invention shall be subject to the protection scope of the claims.

Claims (8)

1. A ship identification method under a sea and sky background based on a broadband-hyperspectral infrared image fusion method is characterized by comprising the following steps:
acquiring a first image and a second image, wherein the first image and the second image are a hyperspectral infrared image and a broadband infrared image of a scene to be identified after spatial registration;
extracting spectral features of the hyperspectral infrared image from the first image, and extracting infrared features of the broadband infrared image from the second image;
fusing the spectral features and the infrared features based on a D-S evidence reasoning method to obtain a first recognition result of the scene to be recognized;
fusing the first image and the second image to obtain a fusion result;
identifying the scene to be identified according to the fusion result to obtain a second identification result;
and combining the first recognition result and the second recognition result to recognize and confirm the scene to be recognized.
2. The method according to claim 1, wherein fusing the spectral features and the infrared features based on a D-S evidence reasoning method to obtain a first recognition result of the scene to be recognized, comprises:
obtaining basic probability assignments of the spectral features corresponding to the target identification objects by using a gray theory as a first evidence, and obtaining the basic probability assignments of the infrared features corresponding to the target identification objects by using the gray theory as a second evidence;
calculating a basic probability assignment function and a confidence interval of each evidence after the first evidence and the second evidence are subjected to fusion processing by using a Dempster combination rule;
and selecting the processed evidence with the maximum support degree from the processed evidences according to the basic probability assignment function and the confidence degree interval of the processed evidences, and determining the processed evidence with the maximum support degree as a first recognition result of the scene to be recognized.
3. The method of claim 1, wherein fusing the first image and the second image to obtain a fused result comprises:
performing histogram equalization stretching processing on the first image to convert the brightness value of the first image into an equalized brightness value to obtain a first equalized brightness value;
performing histogram equalization stretching processing on the second image to convert the brightness value of the second image into an equalized brightness value to obtain a second equalized brightness value;
and fusing the first equalized brightness value and the second equalized brightness value to obtain the fusion result.
4. The method according to claim 3, wherein fusing the first equalized luminance value and the second equalized luminance value to obtain the fused result comprises:
weighting and summing the first equalized brightness value and the second equalized brightness value to obtain a fused brightness value matrix, and determining an image represented by the fused brightness value matrix as a fused image;
and performing histogram restoration on the fused image to obtain the fusion result.
5. A ship identification device under a sea-sky background based on a broadband-hyperspectral infrared image fusion method is characterized by comprising the following steps:
the device comprises an acquisition module, a processing module and a processing module, wherein the acquisition module is used for acquiring a first image and a second image, and the first image and the second image are a hyperspectral infrared image and a broadband infrared image of a scene to be identified after spatial registration;
the extraction module is used for extracting the spectral characteristics of the hyperspectral infrared image from the first image and extracting the infrared characteristics of the broadband infrared image from the second image;
the first fusion module is used for fusing the spectral features and the infrared features based on a D-S evidence reasoning method to obtain a first recognition result of the scene to be recognized;
the second fusion module is used for fusing the first image and the second image to obtain a fusion result;
the recognition module is used for recognizing the scene to be recognized according to the fusion result to obtain a second recognition result;
and the confirmation module is used for carrying out identification confirmation on the scene to be identified by combining the first identification result and the second identification result.
6. The apparatus of claim 5, wherein the first fusion module is configured to:
obtaining basic probability assignments of the spectral features corresponding to the target identification objects by using a gray theory as a first evidence, and obtaining the basic probability assignments of the infrared features corresponding to the target identification objects by using the gray theory as a second evidence;
calculating a basic probability assignment function and a confidence interval of each evidence after the first evidence and the second evidence are subjected to fusion processing by using a Dempster combination rule;
and selecting the processed evidence with the maximum support degree from the processed evidences according to the basic probability assignment function and the confidence degree interval of the processed evidences, and determining the processed evidence with the maximum support degree as a first recognition result of the scene to be recognized.
7. The apparatus of claim 5, wherein the second fusion module comprises:
the first processing unit is used for performing histogram equalization stretching processing on the first image so as to convert the brightness value of the first image into an equalized brightness value and obtain a first equalized brightness value;
the second processing unit is used for performing histogram equalization stretching processing on the second image so as to convert the brightness value of the second image into an equalized brightness value and obtain a second equalized brightness value;
and the fusion unit is used for fusing the first equalized brightness value and the second equalized brightness value to obtain the fusion result.
8. The apparatus of claim 7, wherein the fusion unit is configured to:
weighting and summing the first equalized brightness value and the second equalized brightness value to obtain a fused brightness value matrix, and determining an image represented by the fused brightness value matrix as a fused image;
and performing histogram restoration on the fused image to obtain the fusion result.
CN201810471040.2A 2018-05-16 2018-05-16 Ship identification method and device under sea-sky background based on broadband-hyperspectral infrared image fusion method Active CN108681710B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810471040.2A CN108681710B (en) 2018-05-16 2018-05-16 Ship identification method and device under sea-sky background based on broadband-hyperspectral infrared image fusion method


Publications (2)

Publication Number Publication Date
CN108681710A CN108681710A (en) 2018-10-19
CN108681710B 2020-11-27

Family

ID=63806744


Country Status (1)

CN (1) CN108681710B (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101303724A (en) * 2007-05-10 2008-11-12 中国银联股份有限公司 Authentication authorization method and system
CN105427268A (en) * 2015-12-01 2016-03-23 中国航空工业集团公司洛阳电光设备研究所 Medium-long-wave dual-waveband infrared image feature level color fusion method
CN106056163A (en) * 2016-06-08 2016-10-26 重庆邮电大学 Multi-sensor information fusion object identification method
CN106778815A (en) * 2016-11-23 2017-05-31 河南工业大学 Wheat quality THz spectral classification methods based on DS evidence theories
CN107578432A (en) * 2017-08-16 2018-01-12 南京航空航天大学 Merge visible ray and the target identification method of infrared two band images target signature
CN107767406A (en) * 2017-11-13 2018-03-06 西北工业大学 A kind of multispectral image Dim target tracking method based on DS evidence theories

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7680330B2 (en) * 2003-11-14 2010-03-16 Fujifilm Corporation Methods and apparatus for object recognition using textons
FR2915301A1 (en) * 2007-04-20 2008-10-24 Groupe Ecoles Telecomm PROCESS FOR COMPARISON OF IMAGES OF A BIOMETRY BETWEEN A REFERENCE IMAGE AND AT LEAST ONE TEST IMAGE WHICH WE SEEK TO EVALUATE A DEGREE OF CORRELATION WITH THE REFERENCE IMAGE


Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
A fusion recognition method for infrared small targets based on D-S evidence theory; Li Qiuhua et al.; Systems Engineering and Electronics; 2002-12-31; Vol. 24, No. 6; pp. 25-27 *
Research on remote sensing image fusion technology based on D-S evidence theory; Liu Jiang et al.; Journal of Heilongjiang Institute of Technology; 2017-12-31; Vol. 31, No. 6; pp. 6-10 *
Research on hyperspectral fusion technology for weak and small targets on the sea surface; Zhu Yuanyuan; Journal of Applied Optics; 2017-01-31; Vol. 38, No. 1; pp. 37-41 *



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant