CN107274416A - Hyperspectral image salient object detection method based on spectral gradient and hierarchical structure - Google Patents

Hyperspectral image salient object detection method based on spectral gradient and hierarchical structure

Info

Publication number
CN107274416A
Authority
CN
China
Prior art keywords
image
spectrum
pixel
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201710442878.4A
Other languages
Chinese (zh)
Other versions
CN107274416B (en)
Inventor
魏巍 (Wei Wei)
张艳宁 (Zhang Yanning)
张磊 (Zhang Lei)
严杭琦 (Yan Hangqi)
高凡 (Gao Fan)
高一凡 (Gao Yifan)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Northwestern Polytechnical University
Original Assignee
Northwestern Polytechnical University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Northwestern Polytechnical University filed Critical Northwestern Polytechnical University
Priority to CN201710442878.4A priority Critical patent/CN107274416B/en
Publication of CN107274416A publication Critical patent/CN107274416A/en
Application granted granted Critical
Publication of CN107274416B publication Critical patent/CN107274416B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/10 Segmentation; Edge detection
    • G06T 7/11 Region-based segmentation
    • G06T 7/13 Edge detection
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/20 Special algorithmic details
    • G06T 2207/20024 Filtering details
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 18/00 Pattern recognition
    • G06F 18/20 Analysing
    • G06F 18/23 Clustering techniques
    • G06F 18/25 Fusion techniques
    • G06F 18/259 Fusion by voting
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 10/00 Arrangements for image or video recognition or understanding
    • G06V 10/40 Extraction of image or video features
    • G06V 10/46 Descriptors for shape, contour or point-related descriptors, e.g. scale invariant feature transform [SIFT] or bags of words [BoW]; Salient regional features
    • G06V 10/462 Salient features, e.g. scale invariant feature transforms [SIFT]
    • G06V 20/00 Scenes; Scene-specific elements
    • G06V 20/10 Terrestrial scenes
    • G06V 20/13 Satellite images
    • G06V 20/194 Terrestrial scenes using hyperspectral data, i.e. more or other wavelengths than RGB

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • Artificial Intelligence (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • General Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Image Analysis (AREA)

Abstract

The invention discloses a hyperspectral image salient object detection method based on spectral gradient and hierarchical structure, which solves the technical problem of the heavy computational load of existing hyperspectral salient object detection methods. The technical scheme first generates a spectral gradient map; then generates image segmentation regions; then builds a saliency detection model based on an image hierarchical structure; then builds a saliency computation method based on a background prior and edge features; and finally computes the saliency map. Because the spectral gradient is computed along the spectral dimension of the original hyperspectral image, the spectral gradient features of the image are extracted, which weakens the adverse effect of uneven illumination. Simple Linear Iterative Clustering (SLIC) is used to generate superpixels, which segments the hyperspectral image and accelerates the computation; saliency is measured by the contrast of spectral features between segmented regions, so the computational cost is low.

Description

Hyperspectral image salient object detection method based on spectral gradient and hierarchical structure
Technical field
The present invention relates to a hyperspectral image salient object detection method, and more particularly to a hyperspectral image salient object detection method based on spectral gradient and hierarchical structure.
Background art
A hyperspectral image is the image data obtained by using an imaging spectrometer to record the spectral information of the various ground objects observed in the field of view. As hyperspectral imaging technology has matured, imaging devices have improved greatly in spectral resolution, spatial resolution and other indicators, so that tasks such as object detection, recognition and tracking, originally carried out mainly on ordinary images, are gradually being extended to hyperspectral data. Research on salient object detection in hyperspectral images is still at an early stage.
Existing hyperspectral salient object detection methods mainly adopt the Itti model and replace its color features with the spectral features of the hyperspectral image so that the model can be applied to hyperspectral images. The document "S.L.Moan, A.Mansouri, et al., Saliency for Spectral Image Analysis [J]. IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing, 2013, 6(6): p.2472-2479" discloses a hyperspectral salient object detection method that exploits the spectral information by projecting the spectra into the CIELAB color space and by means such as principal component analysis (PCA). Existing methods take the pixel as the basic unit of saliency estimation and assess the difference between the spectra of different pixels by principal component analysis, Euclidean distance, the spectral angle and similar means, thereby measuring the saliency of each pixel. Reflecting the saliency of the whole image through pixel-level saliency inevitably produces inhomogeneous saliency maps in which object edges respond strongly while object interiors respond weakly. In addition, existing methods rely on a single model and cannot overcome the main difficulties faced by saliency detection on hyperspectral images: the influence of illumination changes on the spectral data and the huge computational load caused by the data scale. It is therefore urgent to break away from the fixed thinking of existing hyperspectral detection methods and propose a new hyperspectral salient object detection method.
Summary of the invention
To overcome the heavy computational load of existing hyperspectral salient object detection methods, the present invention provides a hyperspectral image salient object detection method based on spectral gradient and hierarchical structure. The method first generates a spectral gradient map; then generates image segmentation regions; then builds a saliency detection model based on an image hierarchical structure; then builds a saliency computation method based on a background prior and edge features; and finally computes the saliency map. Because the spectral gradient is computed along the spectral dimension of the original hyperspectral image, the spectral gradient features of the image are extracted, which weakens the adverse effect of uneven illumination. Meanwhile, Simple Linear Iterative Clustering (SLIC) is used to generate superpixels, which segments the hyperspectral image and accelerates the computation; saliency is measured by the contrast of spectral features between segmented regions, so the computational cost is low.
The technical solution adopted by the present invention to solve the technical problem is a hyperspectral image salient object detection method based on spectral gradient and hierarchical structure, characterized by comprising the following steps:
Step 1: generate the spectral gradient image.
The spectral gradient is computed for each pixel to generate the spectral gradient image, so that the extracted spectral gradient feature vectors keep the spatial relationship of the original image:

g_i^{(j)} = \frac{1}{\Delta\lambda} \left( y_i^{(j)} - y_i^{(j-1)} \right)    (1)

where g_i^{(j)} is the j-th component of the spectral gradient vector, y_i^{(j)} is the j-th component of the original spectral vector, and Δλ is the wavelength difference between adjacent bands.
Applying formula (1) to the spectral vector of every pixel in a hyperspectral data block D yields a new spectral gradient data block X.
Step 2: generate the image segmentation regions.
Simple Linear Iterative Clustering is applied to the spectral gradient data block X as follows:
Input: spectral gradient image X, expected superpixel side length s, weight coefficient m;
Output: a segmentation image labelling each superpixel;
1. Initialization:
1) with s as the grid spacing, initialize a set of cluster centers C on the gradient image X;
2) move each center to the position of the minimum gradient within its 3 × 3 neighborhood;
3) set the label of each pixel to l_i = -1 and its distance to its current center to d_i = +∞;
2. Iteratively update the pixel labels and the cluster centers:
1) for the current cluster center C_k, compute with formula (2) the distance D(x_i, C_k) from each pixel x_i in the square neighborhood of side length 2s to C_k;
2) if D(x_i, C_k) < d_i, set the label of x_i to l_i = k and update d_i = D(x_i, C_k);
3) repeat steps 1) and 2) until the change of every center between two successive iterations is below a threshold;
D(x_i, C_k) = \sqrt{ d_g^2(x_i, C_k) + \left[ \frac{d_s(x_i, C_k)}{s} \right]^2 m^2 }    (2)

where d_g(x_i, C_k) is the Euclidean distance between the spectral gradient parts of x_i and C_k, d_s(x_i, C_k) is the Euclidean distance between their spatial positions, and m is the weight coefficient between the two distances.
A dual-kernel function replaces the density function of the mean shift algorithm to carry out the computations of the mean shift procedure; its concrete form is

K_{T_s, T_g}(x) = \frac{\delta}{T_s^2 T_g^2} \, k\left( \left\| \frac{x_s}{T_s} \right\|^2 \right) k\left( \left\| \frac{x_g}{T_g} \right\|^2 \right)    (3)

where x_g is the spectral gradient vector of pixel x, x_s is the spatial coordinate of pixel x, T_g is the kernel bandwidth of the spectral feature, T_s is the kernel bandwidth of the spatial coordinate, and δ is a normalization coefficient.
The mean shift algorithm proceeds as follows:
Input: superpixel center vectors C = {C_1, C_2, ..., C_k, ..., C_n}, spectral threshold T_g, spatial threshold T_s;
Output: label vector l_sp for the input superpixel centers;
1) take superpixel vector C_k as the initial center and run the mean shift procedure; denote the resulting candidate center C_j';
2) for every sample appearing on the path leading to C_j', add 1 to its number of votes for C_j';
3) traverse the current set of candidate centers C' and look for a preferred center C_i' whose spectral gradient distance to C_j' is less than T_g/2 and whose spatial distance is less than T_s/2;
4) if such a C_i' is found, merge the vote counts of C_i' and C_j', add the mean of C_i' and C_j' to C', and delete C_i'; otherwise go to step 5);
5) repeat steps 1) to 4) for every superpixel center to obtain the final cluster centers;
6) assign each superpixel center to the cluster center C_m' that received the most votes, which yields l_sp.
Step 3: build the salient object detection model based on the image hierarchical structure.
The spectral threshold T_g of step 2, which controls the required spectral similarity of samples, and the spatial threshold T_s, which controls the spatial neighborhood of samples, are respectively set to 0.1, 0.2, 0.3 and 0.4 times max{r, c}, and to 10, 20, 25 and 30. After clustering at these different granularities, the bottom-level superpixels produce clustering results on 4 levels in total, forming a 4-layer image hierarchical structure. The superpixel blocks as a whole serve as the bottom-level nodes, their number being the number of superpixels; in addition, the j-th segmented region of the i-th layer is abstracted as a node N̂_{i,j}. Investigating the final saliency of a bottom-level superpixel under a hierarchical structure with h layers, the corresponding saliency detection model is expressed as a weighted sum over the levels,
where the index function returns, for each layer i, the subscript of the node in that layer that contains the superpixel, ω_i is the weight of the layer the node belongs to, and s(N̂_{i,j}) is the saliency value of the node.
Step 4: saliency computation methods based on the location prior, background prior and edge features.
1. Location prior and background prior.
The mathematical expression of the location prior is

p(R_i) = \frac{1}{\omega(R_i)} \sum_{x_k \in R_i} \exp\left( -2 d_{x_k}^2 \right)    (5)

where d_{x_k} is the Euclidean distance from pixel x_k in region R_i to the image center.
The square ring formed by all pixels lying within 10 pixels of the image border is taken as the border region R_b of the image.
For a node N̂_{i,j} in the image hierarchical structure, the following three rules are followed when computing its background prior:
1) if N̂_{i,j} does not intersect R_b, the penalty applied to N̂_{i,j} is zero;
2) otherwise, if N̂_{i,j} intersects R_b, the penalty is nonzero;
3) the larger the overlap |N̂_{i,j} ∩ R_b| between N̂_{i,j} and R_b, the heavier the penalty, i.e. the larger the absolute value of κ(N̂_{i,j});
The above three rules specify the boundary conditions and influencing factors when computing the background prior of N̂_{i,j} with the contact penalty. Within these rules, different concrete forms can be obtained with different penalty functions. The penalty function is defined as

\kappa(\hat{N}_{i,j}) = \xi \cdot \left| \hat{N}_{i,j} \cap R_b \right|    (6)

where ξ is the penalty factor carried by each pixel in the border region.
2. edge feature conspicuousness.
The mean of all bands of the hyperspectral image is taken as its corresponding gray image I_hsi, and edge features are obtained with the Canny edge detector. The region saliency is computed from the edge features in the spatial dimension of the hyperspectral image as follows:
Input: hyperspectral average gray image I_hsi, hierarchical structure node N̂_{i,j} and the segmentation result image I_seg of its level;
Output: the edge feature saliency of N̂_{i,j};
1) extract edges from I_hsi with the Canny edge detector to obtain the edge map;
2) filter I_seg with a 3 × 3 Gaussian filter of variance 1.5 so that the region boundaries become wider;
3) compute the gradient magnitude image of the filtered I_seg and binarize it to obtain the boundary extraction result;
4) accumulate the image edges lying on the boundary of the segmented region N̂_{i,j} to obtain its edge feature saliency; the boundary of N̂_{i,j} is taken from the boundary extraction result and the accumulation runs over the edge features of the edge map located on that boundary.
Step 5: compute the saliency map.
When computing the saliency map, the saliency of each node, that is, of each segmented region N̂_{i,j} on each level of the hierarchical structure, is determined. It is composed of four parts: the spectral gradient region contrast, the edge feature saliency, the location prior and the background prior. The location prior of a node is computed as

p(\hat{N}_{i,j}) = \frac{1}{\left| \hat{N}_{i,j} \right|} \sum_{x_k \in \hat{N}_{i,j}} \exp\left( -2 d_{x_k}^2 \right)    (8)

When applying the priors, only the part based on the regional contrast of spectral features is strengthened; the part based on edge features is left unchanged. Because the chosen image border width is small, the background prior suppresses the background well, so it is applied simultaneously to the computations based on both kinds of features. The final saliency formula is obtained accordingly, with weight coefficients balancing the contributions of the two feature-based parts.
The beneficial effects of the invention are as follows. The method first generates a spectral gradient map, then generates image segmentation regions, then builds a saliency detection model based on an image hierarchical structure, then builds a saliency computation method based on a background prior and edge features, and finally computes the saliency map. Because the spectral gradient is computed along the spectral dimension of the original hyperspectral image, the spectral gradient features of the image are extracted, which weakens the adverse effect of uneven illumination. Meanwhile, Simple Linear Iterative Clustering (SLIC) is used to generate superpixels, which segments the hyperspectral image and accelerates the computation; saliency is measured by the contrast of spectral features between segmented regions, so the computational cost is low.
The present invention is described in detail below with reference to an embodiment.
Embodiment
The specific steps of the hyperspectral image salient object detection method based on spectral gradient and hierarchical structure of the present invention are as follows.
A hyperspectral remote sensing image has a cube structure: the spatial dimensions reflect the reflectance of pixels at different ground positions in particular solar wavelength bands, and the spectral dimension reflects the relationship between the incident and reflected light of a given position across bands. A hyperspectral image can be expressed as a p × n data set X_n = {x_1, x_2, ..., x_n}, where p is the number of bands and n is the total number of pixels in the image; a pixel of the image can be expressed as x_i = (x_{1i}, x_{2i}, ..., x_{pi})^T, where x_{pi} is the reflectance on the p-th band.
Step 1: generate the spectral gradient map.
The spectral gradient is the ratio of the difference between every two adjacent components of the original spectral vector to the corresponding wavelength difference, and the vector formed by this series of spectral gradients is called the spectral gradient vector. The result obtained by computing the spectral gradient for every pixel is called the spectral gradient image, so that the extracted spectral gradient feature vectors keep the spatial relationship of the original image:

g_i^{(j)} = \frac{1}{\Delta\lambda} \left( y_i^{(j)} - y_i^{(j-1)} \right)    (1)

where g_i^{(j)} is the j-th component of the spectral gradient vector, y_i^{(j)} is the j-th component of the original spectral vector, and Δλ is the wavelength difference between adjacent bands.
Applying formula (1) to the spectral vector of every pixel in a hyperspectral data block D yields a new spectral gradient data block X. To a certain extent the spectral gradient reduces the brightness differences caused by uneven illumination, and thus weakens the influence such differences have on the subsequent steps of the algorithm.
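For illustration, the per-pixel spectral gradient of formula (1) can be computed over a whole data cube with a few lines of NumPy. The sketch below assumes the cube is stored as a (rows, cols, bands) array and that the band-center wavelengths are available; the data layout and the function name are assumptions made for this example, not part of the patent.

    import numpy as np

    def spectral_gradient(cube, wavelengths):
        """Per-pixel spectral gradient of a hyperspectral cube, formula (1).

        cube        : ndarray of shape (rows, cols, bands) holding reflectance
        wavelengths : 1-D array of band-center wavelengths, length = bands
        Returns an array of shape (rows, cols, bands - 1).
        """
        d_lambda = np.diff(wavelengths)            # delta lambda per adjacent band pair
        return np.diff(cube, axis=-1) / d_lambda   # (y^(j) - y^(j-1)) / delta lambda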
Step 2: generate the image segmentation regions.
Because the saliency of a single pixel depends on the uniqueness of its features within a neighborhood (usually a 3 × 3 pixel range), such pixel-level saliency can hardly reflect the saliency of the corresponding macroscopic object. Generating superpixels effectively reduces the redundancy of local regions in the image, simplifies the image representation, and lowers the complexity of subsequent processing; in addition, superpixels help extract mid-level image information and better match the way humans perceive images. Simple Linear Iterative Clustering (SLIC) is applied to the spectral gradient data X; the SLIC algorithm proceeds as follows:
Input: spectral gradient image X, expected superpixel side length s, weight coefficient m;
Output: a segmentation image labelling each superpixel;
1. Initialization:
1) with s as the grid spacing, initialize a set of cluster centers C on the gradient image X;
2) move each center to the position of the minimum gradient within its 3 × 3 neighborhood;
3) set the label of each pixel to l_i = -1 and its distance to its current center to d_i = +∞;
2. Iteratively update the pixel labels and the cluster centers:
1) for the current cluster center C_k, compute with formula (2) the distance D(x_i, C_k) from each pixel x_i in the square neighborhood of side length 2s to C_k;
2) if D(x_i, C_k) < d_i, set the label of x_i to l_i = k and update d_i = D(x_i, C_k);
3) repeat steps 1) and 2) until the change of every center between two successive iterations is below a threshold;
D(x_i, C_k) = \sqrt{ d_g^2(x_i, C_k) + \left[ \frac{d_s(x_i, C_k)}{s} \right]^2 m^2 }    (2)

where d_g(x_i, C_k) is the Euclidean distance between the spectral gradient parts of x_i and C_k, d_s(x_i, C_k) is the Euclidean distance between their spatial positions, and m is the weight coefficient between the two distances.
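The joint distance of formula (2) follows directly from its definition; the function below is a minimal sketch, and the argument names are hypothetical.

    import numpy as np

    def slic_distance(pixel_grad, pixel_xy, center_grad, center_xy, s, m):
        """Joint spectral-gradient / spatial distance of formula (2)."""
        d_g = np.linalg.norm(pixel_grad - center_grad)   # spectral-gradient part
        d_s = np.linalg.norm(pixel_xy - center_xy)       # spatial part
        return np.sqrt(d_g ** 2 + (d_s / s) ** 2 * m ** 2)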
The SLIC superpixel generation above divides the spectral gradient image into many small segmented regions. Superpixel segmentation produces an over-segmentation: an object in the real scene is often split into many fragmented regions, which prevents the salient object detection algorithm from obtaining a uniform result inside objects. The superpixels are therefore clustered to obtain visual features at a higher level and to improve the internal homogeneity of the detection result. Because the samples to be clustered here are the superpixel centers, which contain both spectral features and spatial coordinates, the present invention uses a dual-kernel function in place of the density function of the mean shift (Mean-Shift) algorithm to carry out the computations of the mean shift procedure; its concrete form is

K_{T_s, T_g}(x) = \frac{\delta}{T_s^2 T_g^2} \, k\left( \left\| \frac{x_s}{T_s} \right\|^2 \right) k\left( \left\| \frac{x_g}{T_g} \right\|^2 \right)    (3)

where x_g is the spectral gradient vector of pixel x, x_s is the spatial coordinate of pixel x, T_g is the kernel bandwidth of the spectral feature, T_s is the kernel bandwidth of the spatial coordinate, and δ is a normalization coefficient.
The Mean-Shift algorithm proceeds as follows:
Input: superpixel center vectors C = {C_1, C_2, ..., C_k, ..., C_n}, spectral threshold T_g, spatial threshold T_s;
Output: label vector l_sp for the input superpixel centers;
1) take superpixel vector C_k as the initial center and run the mean shift procedure; denote the resulting candidate center C_j';
2) for every sample appearing on the path leading to C_j', add 1 to its number of votes for C_j';
3) traverse the current set of candidate centers C' and look for a preferred center C_i' whose spectral gradient distance to C_j' is less than T_g/2 and whose spatial distance is less than T_s/2;
4) if such a C_i' is found, merge the vote counts of C_i' and C_j', add the mean of C_i' and C_j' to C', and delete C_i'; otherwise go to step 5);
5) repeat steps 1) to 4) for every superpixel center to obtain the final cluster centers;
6) assign each superpixel center to the cluster center C_m' that received the most votes, which yields l_sp.
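A minimal sketch of the dual-kernel weight of formula (3) is given below. The kernel profile k(·) is not specified in the text, so a Gaussian profile is assumed here, and the normalization coefficient δ is omitted because mean shift only needs the weights up to a common factor.

    import numpy as np

    def dual_kernel_weight(x_s, x_g, T_s, T_g):
        """Dual-kernel weight of formula (3) with an assumed Gaussian profile."""
        k = lambda u: np.exp(-0.5 * u)                        # assumed profile k(u)
        w_spatial = k(np.sum((np.asarray(x_s) / T_s) ** 2))   # k(||x_s / T_s||^2)
        w_spectral = k(np.sum((np.asarray(x_g) / T_g) ** 2))  # k(||x_g / T_g||^2)
        return w_spatial * w_spectral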
Step 3: build the salient object detection model based on the image hierarchical structure.
After the steps above, considering only the spatial relationship between segmented regions on a single segmentation level cannot describe well their connection with the semantic regions they belong to, so processing tasks related to image semantic understanding, such as saliency detection, can hardly obtain good results. In fact, the multiple segmentation levels of an image jointly constitute its hierarchical structure, and exploiting this structure effectively strengthens the use the processing method makes of image semantic information.
The spectral threshold T_g of the previous step, which controls the required spectral similarity of samples, and the spatial threshold T_s, which controls the spatial neighborhood of samples, are respectively set to 0.1, 0.2, 0.3 and 0.4 times max{r, c}, and to 10, 20, 25 and 30. After clustering at these different granularities, the bottom-level superpixels produce clustering results on 4 levels in total, forming a 4-layer image hierarchical structure. The superpixel blocks as a whole serve as the bottom-level nodes, their number being the number of superpixels; in addition, the j-th segmented region of the i-th layer is abstracted as a node N̂_{i,j}. Investigating the final saliency of a bottom-level superpixel under a hierarchical structure with h layers, the corresponding saliency detection model can be expressed as a weighted sum over the levels, where the index function returns, for each layer i, the subscript of the node in that layer that contains the superpixel, ω_i is the weight of the layer the node belongs to, and s(N̂_{i,j}) is the saliency value of the node.
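The weighted-sum reading of this hierarchical model can be sketched as follows; the data layout (one saliency array and one assignment array per level) and the function name are assumptions made for illustration, since the model is only described in words here.

    import numpy as np

    def superpixel_saliency(node_saliency, assignment, level_weights):
        """Weighted sum of node saliencies over the levels of the hierarchy.

        node_saliency : list of h NumPy arrays, node_saliency[i][j] = s(N_{i,j})
        assignment    : list of h integer arrays, assignment[i][k] = index of
                        the level-i node that contains bottom superpixel k
        level_weights : per-level weights omega_i
        Returns one saliency value per bottom-level superpixel.
        """
        s = np.zeros(len(assignment[0]))
        for w, sal, f in zip(level_weights, node_saliency, assignment):
            s += w * sal[f]          # add omega_i * s(N_{i, f_i(k)}) for every k
        return s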
Step 4: saliency computation methods based on the location prior, background prior and edge features.
Considering that human attention to the center of a scene is usually higher than to the surrounding area, the invention introduces a prior based on the position of the segmented region together with a background prior. The background prior likewise assumes that salient regions tend to lie close to the image center, so the background is usually more likely to be distributed near the image boundary; these border regions are therefore penalized to suppress their response in the detection method. The present invention builds an image hierarchical structure on a spectral gradient image with the SLIC and Mean-Shift algorithms and gives the salient object detection model under that hierarchical structure; the newly introduced edge-feature-based saliency computation and background prior further improve the model's ability to describe region saliency. The location prior, background prior and edge feature saliency computations are described in detail below.
1. Location prior and background prior.
Considering that human attention to the center of a scene is usually higher than to the surrounding area, the mathematical expression of the location prior is

p(R_i) = \frac{1}{\omega(R_i)} \sum_{x_k \in R_i} \exp\left( -2 d_{x_k}^2 \right)    (5)

where d_{x_k} is the Euclidean distance from pixel x_k in region R_i to the image center.
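Formula (5) can be sketched as below; the normalization of the distance to the image center and the use of the pixel count for ω(R_i) are assumptions made for this example, since the text does not state them.

    import numpy as np

    def location_prior(region_mask):
        """Location prior of formula (5) for one region given as a boolean mask."""
        rows, cols = region_mask.shape
        yy, xx = np.nonzero(region_mask)                  # pixels x_k in R_i
        cy, cx = (rows - 1) / 2.0, (cols - 1) / 2.0       # image center
        # Distance to the center, normalized by half the image diagonal (assumption).
        d = np.hypot(yy - cy, xx - cx) / (0.5 * np.hypot(rows, cols))
        return np.exp(-2.0 * d ** 2).sum() / region_mask.sum()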
The present invention selects the square ring formed by all pixels lying within 10 pixels of the image border as the border region R_b of the image. To suppress segmented regions that intersect R_b, a "contact penalty" is used. For a node N̂_{i,j} in the image hierarchical structure, the following three rules should be followed when computing its background prior:
1) if N̂_{i,j} does not intersect R_b, the penalty applied to N̂_{i,j} is zero;
2) otherwise, if N̂_{i,j} intersects R_b, the penalty is nonzero;
3) the larger the overlap |N̂_{i,j} ∩ R_b| between N̂_{i,j} and R_b, the heavier the penalty, i.e. the larger the absolute value of κ(N̂_{i,j});
The above three rules specify the boundary conditions and influencing factors when computing the background prior of N̂_{i,j} with the contact penalty. Within these rules, different concrete forms can be obtained with different penalty functions. The penalty function is defined as

\kappa(\hat{N}_{i,j}) = \xi \cdot \left| \hat{N}_{i,j} \cap R_b \right|    (6)

where ξ is the penalty factor carried by each pixel in the border region.
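The contact penalty of formula (6) amounts to counting the region pixels that fall inside the border ring; a minimal sketch, with hypothetical argument names:

    import numpy as np

    def contact_penalty(region_mask, border_mask, xi):
        """Contact penalty of formula (6): xi times the overlap with the border ring."""
        overlap = np.logical_and(region_mask, border_mask).sum()
        return xi * overlap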
2. edge feature conspicuousness.
The human visual system is more sensitive to image edges, and visual attention is easily attracted by image regions with prominent edge features, mainly because the gray values around image edges usually change sharply. Therefore the present invention not only uses the region contrast method based on spectral gradient features but also introduces edge features in the spatial dimension of the hyperspectral image to further improve the result. The present invention takes the mean of all bands of the hyperspectral image as its corresponding gray image I_hsi and obtains edge features with the Canny edge detector. The region saliency is computed from the edge features in the spatial dimension of the hyperspectral image as follows:
Input: hyperspectral average gray image I_hsi, hierarchical structure node N̂_{i,j} and the segmentation result image I_seg of its level;
Output: the edge feature saliency of N̂_{i,j};
1) extract edges from I_hsi with the Canny edge detector to obtain the edge map;
2) filter I_seg with a 3 × 3 Gaussian filter of variance 1.5 so that the region boundaries become wider;
3) compute the gradient magnitude image of the filtered I_seg and binarize it to obtain the boundary extraction result;
4) accumulate the image edges lying on the boundary of the segmented region N̂_{i,j} to obtain its edge feature saliency; the boundary of N̂_{i,j} is taken from the boundary extraction result and the accumulation runs over the edge features of the edge map located on that boundary.
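The procedure above can be sketched with standard image-processing routines; the choice of scikit-image's Canny detector, the binarization threshold and the way the region boundary is intersected with the boundary map are assumptions made for illustration.

    import numpy as np
    from scipy import ndimage
    from skimage.feature import canny

    def edge_saliency(cube, seg_labels, region_label):
        """Edge-feature saliency of one segmented region (sketch of steps 1-4)."""
        gray = cube.mean(axis=-1)                  # band-averaged gray image I_hsi
        edges = canny(gray)                        # step 1: Canny edge map

        # Steps 2-3: blur the segmentation image so boundaries widen, then take the
        # gradient magnitude and binarize it (threshold chosen here as an assumption).
        blurred = ndimage.gaussian_filter(seg_labels.astype(float), sigma=1.5)
        gy, gx = np.gradient(blurred)
        boundary = np.hypot(gx, gy) > 1e-3

        # Step 4: keep the part of the boundary band belonging to the requested
        # region and accumulate the image edges lying on it.
        region_boundary = boundary & (seg_labels == region_label)
        return np.sum(edges[region_boundary])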
Step 5: compute the saliency map.
When computing the saliency map, what is determined within the hierarchical structure is the saliency of each node, that is, of each segmented region N̂_{i,j} on each level. As stated above, it is composed of four parts: the spectral gradient region contrast, the edge feature saliency, the location prior and the background prior. The location prior of a node is computed as

p(\hat{N}_{i,j}) = \frac{1}{\left| \hat{N}_{i,j} \right|} \sum_{x_k \in \hat{N}_{i,j}} \exp\left( -2 d_{x_k}^2 \right)    (8)

When applying the priors, because the location prior weights the central region with an exponential function of distance and too strong an effect is not entirely beneficial to detection performance, only the part based on the regional contrast of spectral features is strengthened; the part based on edge features is left unchanged. Because the chosen image border width is small, the background prior suppresses the background well, so it is applied simultaneously to the computations based on both kinds of features. The final saliency formula of the present invention is obtained accordingly, with weight coefficients balancing the contributions of the two feature-based parts.
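Because the final formula is not reproduced in this text, the sketch below shows only one plausible combination consistent with the description: the location prior scales the contrast term, the edge term is left unscaled, and the contact penalty is applied once to the combined score; the weight names alpha and beta are hypothetical.

    def node_saliency(contrast, edge, location_prior, penalty, alpha, beta):
        """One plausible per-node combination of the four parts described above."""
        # alpha, beta: weight coefficients; penalty: background-prior contact penalty.
        return alpha * location_prior * contrast + beta * edge - penalty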

Claims (1)

1. A hyperspectral image salient object detection method based on spectral gradient and hierarchical structure, characterized by comprising the following steps:
Step 1: generate the spectral gradient image;
compute the spectral gradient for each pixel to generate the spectral gradient image, so that the extracted spectral gradient feature vectors keep the spatial relationship of the original image;
g_i^{(j)} = \frac{1}{\Delta\lambda} \left( y_i^{(j)} - y_i^{(j-1)} \right)    (1)
where g_i^{(j)} is the j-th component of the spectral gradient vector; y_i^{(j)} is the j-th component of the original spectral vector; Δλ is the wavelength difference between adjacent bands;
applying formula (1) to the spectral vector of every pixel in a hyperspectral data block D yields a new spectral gradient data block X;
Step 2: generate the image segmentation regions;
apply Simple Linear Iterative Clustering to the spectral gradient data block X as follows:
Input: spectral gradient image X, expected superpixel side length s, weight coefficient m;
Output: a segmentation image labelling each superpixel;
1. Initialization:
1) with s as the grid spacing, initialize a set of cluster centers C on the gradient image X;
2) move each center to the position of the minimum gradient within its 3 × 3 neighborhood;
3) set the label of each pixel to l_i = -1 and its distance to its current center to d_i = +∞;
2. Iteratively update the pixel labels and the cluster centers:
1) for the current cluster center C_k, compute with formula (2) the distance D(x_i, C_k) from each pixel x_i in the square neighborhood of side length 2s to C_k;
2) if D(x_i, C_k) < d_i, set the label of x_i to l_i = k and update d_i = D(x_i, C_k);
3) repeat steps 1) and 2) until the change of every center between two successive iterations is below a threshold;
D(x_i, C_k) = \sqrt{ d_g^2(x_i, C_k) + \left[ \frac{d_s(x_i, C_k)}{s} \right]^2 m^2 }    (2)
where d_g(x_i, C_k) is the Euclidean distance between the spectral gradient parts of x_i and C_k; d_s(x_i, C_k) is the Euclidean distance between their spatial positions; m is the weight coefficient between the two distances;
a dual-kernel function replaces the density function of the mean shift algorithm to carry out the computations of the mean shift procedure; its concrete form is
K_{T_s, T_g}(x) = \frac{\delta}{T_s^2 T_g^2} \, k\left( \left\| \frac{x_s}{T_s} \right\|^2 \right) k\left( \left\| \frac{x_g}{T_g} \right\|^2 \right)    (3)
where x_g is the spectral gradient vector of pixel x; x_s is the spatial coordinate of pixel x; T_g is the kernel bandwidth of the spectral feature; T_s is the kernel bandwidth of the spatial coordinate; δ is a normalization coefficient;
the mean shift algorithm proceeds as follows:
Input: superpixel center vectors C = {C_1, C_2, ..., C_k, ..., C_n}, spectral threshold T_g, spatial threshold T_s;
Output: label vector l_sp for the input superpixel centers;
1) take superpixel vector C_k as the initial center and run the mean shift procedure; denote the resulting candidate center C_j';
2) for every sample appearing on the path leading to C_j', add 1 to its number of votes for C_j';
3) traverse the current set of candidate centers C' and look for a preferred center C_i' whose spectral gradient distance to C_j' is less than T_g/2 and whose spatial distance is less than T_s/2;
4) if such a C_i' is found, merge the vote counts of C_i' and C_j', add the mean of C_i' and C_j' to C', and delete C_i'; otherwise go to step 5);
5) repeat steps 1) to 4) for every superpixel center to obtain the final cluster centers;
6) assign each superpixel center to the cluster center C_m' that received the most votes, which yields l_sp;
Step 3: build the salient object detection model based on the image hierarchical structure;
the spectral threshold T_g of step 2, which controls the required spectral similarity of samples, and the spatial threshold T_s, which controls the spatial neighborhood of samples, are respectively set to 0.1, 0.2, 0.3 and 0.4 times max{r, c}, and to 10, 20, 25 and 30; after clustering at these different granularities, the bottom-level superpixels produce clustering results on 4 levels in total, forming a 4-layer image hierarchical structure; the superpixel blocks as a whole serve as the bottom-level nodes, their number being the number of superpixels; in addition, the j-th segmented region of the i-th layer is abstracted as a node N̂_{i,j}; investigating the final saliency of a bottom-level superpixel under a hierarchical structure with h layers, the corresponding saliency detection model is expressed as a weighted sum over the levels,
where the index function returns, for each layer i, the subscript of the node in that layer that contains the superpixel; ω_i is the weight of the layer the node belongs to; s(N̂_{i,j}) is the saliency value of the node;
Step 4: saliency computation methods based on the location prior, background prior and edge features;
1. Location prior and background prior;
the mathematical expression of the location prior is as follows:
p(R_i) = \frac{1}{\omega(R_i)} \sum_{x_k \in R_i} \exp\left( -2 d_{x_k}^2 \right)    (5)
where d_{x_k} is the Euclidean distance from pixel x_k in region R_i to the image center;
the square ring formed by all pixels lying within 10 pixels of the image border is the border region R_b of the image; for a node N̂_{i,j} in the image hierarchical structure, the following three rules are followed when computing its background prior:
1) if N̂_{i,j} does not intersect R_b, the penalty applied to N̂_{i,j} is zero;
2) otherwise, if N̂_{i,j} intersects R_b, the penalty is nonzero;
3) the larger the overlap |N̂_{i,j} ∩ R_b| between N̂_{i,j} and R_b, the heavier the penalty, i.e. the larger the absolute value of κ(N̂_{i,j});
the above three rules specify the boundary conditions and influencing factors when computing the background prior of N̂_{i,j} with the contact penalty; within these rules, different concrete forms are obtained with different penalty functions; the penalty function is defined as
\kappa(\hat{N}_{i,j}) = \xi \cdot \left| \hat{N}_{i,j} \cap R_b \right|    (6)
where ξ is the penalty factor carried by each pixel in the border region;
2. edge feature conspicuousness;
take the mean of all bands of the hyperspectral image as its corresponding gray image I_hsi and obtain edge features with the Canny edge detector; compute the region saliency from the edge features in the spatial dimension of the hyperspectral image as follows:
Input: hyperspectral average gray image I_hsi, hierarchical structure node N̂_{i,j} and the segmentation result image I_seg of its level;
Output: the edge feature saliency of N̂_{i,j};
1) extract edges from I_hsi with the Canny edge detector to obtain the edge map;
2) filter I_seg with a 3 × 3 Gaussian filter of variance 1.5 so that the region boundaries become wider;
3) compute the gradient magnitude image of the filtered I_seg and binarize it to obtain the boundary extraction result;
4) accumulate the image edges lying on the boundary of the segmented region N̂_{i,j} to obtain its edge feature saliency; the boundary of N̂_{i,j} is taken from the boundary extraction result and the accumulation runs over the edge features of the edge map located on that boundary;
Step 5: compute the saliency map;
when computing the saliency map, the saliency of each node, that is, of each segmented region N̂_{i,j} on each level of the hierarchical structure, is determined; it is composed of four parts: the spectral gradient region contrast, the edge feature saliency, the location prior and the background prior; the location prior of a node is computed as follows:
p(\hat{N}_{i,j}) = \frac{1}{\left| \hat{N}_{i,j} \right|} \sum_{x_k \in \hat{N}_{i,j}} \exp\left( -2 d_{x_k}^2 \right)    (8)
when applying the priors, only the part based on the regional contrast of spectral features is strengthened and the part based on edge features is not modified; because the chosen image border width is small, the background prior suppresses the background well, so it is applied simultaneously to the computations based on both kinds of features; the final saliency formula is obtained accordingly,
with weight coefficients balancing the contributions of the two feature-based parts.
CN201710442878.4A 2017-06-13 2017-06-13 Hyperspectral image salient object detection method based on spectral gradient and hierarchical structure Active CN107274416B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710442878.4A CN107274416B (en) 2017-06-13 2017-06-13 Hyperspectral image salient object detection method based on spectral gradient and hierarchical structure

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201710442878.4A CN107274416B (en) 2017-06-13 2017-06-13 Hyperspectral image salient object detection method based on spectral gradient and hierarchical structure

Publications (2)

Publication Number Publication Date
CN107274416A true CN107274416A (en) 2017-10-20
CN107274416B CN107274416B (en) 2019-11-01

Family

ID=60066918

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710442878.4A Active CN107274416B (en) 2017-06-13 2017-06-13 Hyperspectral image salient object detection method based on spectral gradient and hierarchical structure

Country Status (1)

Country Link
CN (1) CN107274416B (en)

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105469392A (en) * 2015-11-18 2016-04-06 西北工业大学 High spectral image significance detection method based on regional spectrum gradient characteristic comparison
CN105913023A (en) * 2016-04-12 2016-08-31 西北工业大学 Cooperated detecting method for ice of The Yellow River based on multispectral image and SAR image
CN106503739A (en) * 2016-10-31 2017-03-15 中国地质大学(武汉) The target in hyperspectral remotely sensed image svm classifier method and system of combined spectral and textural characteristics

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
HANGQI YAN ET AL.: "SALIENT OBJECT DETECTION IN HYPERSPECTRAL IMAGERY USING SPECTRAL GRADIENT CONTRAST", IGARSS 2016 *

Cited By (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108241854A (en) * 2018-01-02 2018-07-03 天津大学 A kind of deep video conspicuousness detection method based on movement and recall info
CN108241854B (en) * 2018-01-02 2021-11-09 天津大学 Depth video saliency detection method based on motion and memory information
CN108416746A (en) * 2018-02-07 2018-08-17 西北大学 Based on high-spectrum image dimensionality reduction and the polychrome cultural relics pattern Enhancement Method that merges
CN108416746B (en) * 2018-02-07 2023-04-18 西北大学 Colored drawing cultural relic pattern enhancement method based on dimension reduction and fusion of hyperspectral images
CN108427931A (en) * 2018-03-21 2018-08-21 合肥工业大学 The detection method of barrier before a kind of mine locomotive based on machine vision
CN108427931B (en) * 2018-03-21 2019-09-10 合肥工业大学 The detection method of barrier before a kind of mine locomotive based on machine vision
CN109063537B (en) * 2018-06-06 2021-08-17 北京理工大学 Hyperspectral image preprocessing method for unmixing of abnormal small target
CN109063537A (en) * 2018-06-06 2018-12-21 北京理工大学 The high spectrum image preprocess method mixed for abnormal Small object solution
CN109191482A (en) * 2018-10-18 2019-01-11 北京理工大学 A kind of image combination and segmentation method based on region adaptivity spectral modeling threshold value
CN109191482B (en) * 2018-10-18 2021-09-21 北京理工大学 Image merging and segmenting method based on regional adaptive spectral angle threshold
CN109559364A (en) * 2018-11-27 2019-04-02 东南大学 A kind of figure building method based on smoothness constraint
CN109559364B (en) * 2018-11-27 2023-05-30 东南大学 Graph construction method based on smoothness constraint
CN109829480A (en) * 2019-01-04 2019-05-31 广西大学 The method and system of the detection of body surface bloom feature and material classification
CN109975794A (en) * 2019-03-29 2019-07-05 江西理工大学 A method of intelligent manufacturing system detection and control are carried out using high light spectrum image-forming ranging model
CN109975794B (en) * 2019-03-29 2022-12-09 江西理工大学 Method for detecting and controlling intelligent manufacturing system by using hyperspectral imaging ranging model
WO2021027193A1 (en) * 2019-08-12 2021-02-18 佳都新太科技股份有限公司 Face clustering method and apparatus, device and storage medium
CN111160300A (en) * 2019-12-31 2020-05-15 北京理工大学重庆创新中心 Deep learning hyperspectral image saliency detection algorithm combined with global prior
CN111160300B (en) * 2019-12-31 2022-06-28 北京理工大学重庆创新中心 Deep learning hyperspectral image saliency detection algorithm combined with global prior
CN111832630A (en) * 2020-06-23 2020-10-27 成都恒创新星科技有限公司 Target detection method based on first-order gradient neural network
CN112070098A (en) * 2020-08-20 2020-12-11 西安理工大学 Hyperspectral image salient target detection method based on frequency adjustment model
CN112070098B (en) * 2020-08-20 2024-02-09 西安理工大学 Hyperspectral image salient target detection method based on frequency adjustment model

Also Published As

Publication number Publication date
CN107274416B (en) 2019-11-01

Similar Documents

Publication Publication Date Title
CN107274416A Hyperspectral image salient object detection method based on spectral gradient and hierarchical structure
Wang et al. Auto-AD: Autonomous hyperspectral anomaly detection network based on fully convolutional autoencoder
Zhang et al. Joint Deep Learning for land cover and land use classification
CN110135267B (en) Large-scene SAR image fine target detection method
EP3614308B1 (en) Joint deep learning for land cover and land use classification
Yin et al. Hot region selection based on selective search and modified fuzzy C-means in remote sensing images
Wang et al. Land cover change detection at subpixel resolution with a Hopfield neural network
CN109376611A (en) A kind of saliency detection method based on 3D convolutional neural networks
CN113505792B (en) Multi-scale semantic segmentation method and model for unbalanced remote sensing image
CN108389220B (en) Remote sensing video image motion target real-time intelligent cognitive method and its device
Wan et al. AFSar: An anchor-free SAR target detection algorithm based on multiscale enhancement representation learning
CN111311647B (en) Global-local and Kalman filtering-based target tracking method and device
CN107590515A (en) The hyperspectral image classification method of self-encoding encoder based on entropy rate super-pixel segmentation
CN112115961B (en) Hyperspectral remote sensing image classification method based on sparse graph regularization
CN103426158A (en) Method for detecting two-time-phase remote sensing image change
Wang et al. Spatiotemporal subpixel mapping of time-series images
CN115631344A (en) Target detection method based on feature adaptive aggregation
CN110334656A (en) Multi-source Remote Sensing Images Clean water withdraw method and device based on information source probability weight
CN111460966B (en) Hyperspectral remote sensing image classification method based on metric learning and neighbor enhancement
Wang et al. The PAN and MS image fusion algorithm based on adaptive guided filtering and gradient information regulation
Cao et al. Semi-supervised feature learning for disjoint hyperspectral imagery classification
Singh et al. A hybrid approach for information extraction from high resolution satellite imagery
CN116311387B (en) Cross-modal pedestrian re-identification method based on feature intersection
CN114049503A (en) Saliency region detection method based on non-end-to-end deep learning network
CN113989672A (en) SAR image ship detection method based on balance learning

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant