CN117152634B - Multi-source satellite image floating plant identification method and system based on chromaticity index - Google Patents

Multi-source satellite image floating plant identification method and system based on chromaticity index

Info

Publication number
CN117152634B
CN117152634B (application CN202311170191.1A)
Authority
CN
China
Prior art keywords
chromaticity
processed
calculating
distance
point
Prior art date
Legal status
Active
Application number
CN202311170191.1A
Other languages
Chinese (zh)
Other versions
CN117152634A (en)
Inventor
秦雁
罗澍然
吉红香
郑泳
黄本胜
王晟力
彭力恒
蔡季宏
周晓鑫
徐张帆
杨楚旋
钟丽坤
Current Assignee
Guangdong Research Institute of Water Resources and Hydropower
Original Assignee
Guangdong Research Institute of Water Resources and Hydropower
Priority date
Filing date
Publication date
Application filed by Guangdong Research Institute of Water Resources and Hydropower filed Critical Guangdong Research Institute of Water Resources and Hydropower
Priority to CN202311170191.1A
Publication of CN117152634A
Application granted
Publication of CN117152634B
Legal status: Active
Anticipated expiration

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/10 Terrestrial scenes
    • G06V20/13 Satellite images
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/20 Image preprocessing
    • G06V10/22 Image preprocessing by selection of a specific region containing or referencing a pattern; Locating or processing of specific regions to guide the detection or recognition
    • G06V10/40 Extraction of image or video features
    • G06V10/56 Extraction of image or video features relating to colour
    • G06V10/70 Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/77 Processing image or video features in feature spaces; using data integration or data reduction, e.g. principal component analysis [PCA] or independent component analysis [ICA] or self-organising maps [SOM]; Blind source separation
    • G06V10/80 Fusion, i.e. combining data from various sources at the sensor level, preprocessing level, feature extraction level or classification level
    • G06V20/188 Vegetation

Landscapes

  • Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • General Health & Medical Sciences (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Health & Medical Sciences (AREA)
  • Remote Sensing (AREA)
  • Astronomy & Astrophysics (AREA)
  • Artificial Intelligence (AREA)
  • Computing Systems (AREA)
  • Databases & Information Systems (AREA)
  • Evolutionary Computation (AREA)
  • Medical Informatics (AREA)
  • Software Systems (AREA)
  • Image Processing (AREA)

Abstract

The invention discloses a multi-source satellite image floating plant identification method based on a chromaticity index, which comprises the following steps: acquiring a satellite remote sensing image to be processed, wherein the satellite remote sensing image to be processed comprises a synthetic aperture radar satellite remote sensing image or an optical satellite remote sensing image; calculating a first vegetation map based on a chromaticity index according to the satellite remote sensing image to be processed; acquiring a preset time period; calculating a water body template map according to the preset time period; calculating a second vegetation map according to the first vegetation map and the water body template map; and extracting a target vegetation map from the second vegetation map. The invention realizes the identification of floating plants, reduces labor costs and the influence of threshold selection, and improves identification accuracy and efficiency. The invention can be widely applied in the technical field of floating plant identification.

Description

Multi-source satellite image floating plant identification method and system based on chromaticity index
Technical Field
The invention relates to the technical field of floating plant identification, and in particular to a multi-source satellite image floating plant identification method and system based on a chromaticity index.
Background
Floating plants are aquatic plants, including water hyacinth and duckweed, whose roots do not grow in the bottom mud; the plant body floats on the water surface and drifts with the current. Among them, water hyacinth is classified as one of the ten worst invasive species owing to its extremely strong reproductive capacity, and floating plants in southern regions are predominantly water hyacinth. Physical salvage is the best and safest treatment for water hyacinth, but a reasonable salvage plan requires a comprehensive and timely grasp of the distribution of water hyacinth, identification of its growth sources and heavily affected areas, and accurate determination of the salvage position, time and range. In the prior art, floating plant identification methods have high labor costs, are strongly affected by threshold selection, and suffer from low identification accuracy and efficiency.
Disclosure of Invention
The embodiments of the invention provide a multi-source satellite image floating plant identification method based on a chromaticity index, which effectively reduces labor costs and the influence of threshold selection and improves identification accuracy and efficiency.
In one aspect, an embodiment of the invention provides a multi-source satellite image floating plant identification method based on a chromaticity index, comprising the following steps:
Acquiring a satellite remote sensing image to be processed, wherein the satellite remote sensing image to be processed comprises a synthetic aperture radar satellite remote sensing image or an optical satellite remote sensing image;
calculating a first vegetation map based on a chromaticity index according to the satellite remote sensing image to be processed;
acquiring a preset time period;
Calculating a water body template diagram according to the preset time period;
calculating a second vegetation map according to the first vegetation map and the water body template map;
And extracting a target vegetation map from the second vegetation map.
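The claimed steps can be sketched end to end in Python. The helper functions below are illustrative stand-ins (simple thresholding and frequency voting over the time series), not the patented chromaticity-index or water-template computations described later in the description:

```python
import numpy as np

def first_vegetation_map(image, threshold=0.5):
    # Stand-in for the chromaticity-index vegetation extraction:
    # mark pixels above an illustrative threshold as vegetation.
    return (image > threshold).astype(np.uint8)

def water_template(time_series):
    # Stand-in water body template: mark a pixel as water when it is
    # water in more than half of the time-series water maps.
    freq = np.mean(time_series, axis=0)
    return (freq > 0.5).astype(np.uint8)

def second_vegetation_map(veg, water):
    # Keep vegetation pixels only where the water template marks water,
    # removing land vegetation from the first vegetation map.
    return veg * water

img = np.array([[0.9, 0.1], [0.7, 0.2]])
series = np.array([[[1, 1], [0, 1]],
                   [[1, 0], [0, 1]],
                   [[1, 1], [1, 1]]])
veg = first_vegetation_map(img)
water = water_template(series)
target = second_vegetation_map(veg, water)
```

Masking the chromaticity-derived vegetation map with a long-term water template is what removes shore vegetation without a hand-tuned scene threshold.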
In some embodiments, the calculating a water template map according to the preset time period includes:
acquiring a time sequence image set according to the preset time period, wherein the time sequence image set comprises a plurality of satellite remote sensing images to be processed;
Calculating a time sequence water body atlas according to the time sequence image set, wherein the time sequence water body atlas comprises a plurality of third vegetation maps;
calculating a water body pixel frequency chart according to the time sequence water body chart set;
Calculating a water body frequency chart according to the water body pixel frequency chart;
and calculating the water body template diagram according to the water body frequency diagram.
In some embodiments, the calculating a first vegetation map based on a chromaticity index according to the satellite remote sensing image to be processed includes:
Synthesizing a target color image according to the satellite remote sensing image to be processed;
calculating dominant wavelength and purity based on chromaticity index according to the target color image;
And extracting the first vegetation graph from the target color image according to the dominant wavelength and the purity.
In some embodiments, the synthesizing the target color image according to the satellite remote sensing image to be processed includes:
preprocessing the satellite remote sensing image to be processed to obtain a target satellite remote sensing image;
synthesizing a color image to be processed according to the target satellite remote sensing image, wherein the color image to be processed comprises a pseudo-color image, a visual true color image or a standard pseudo-color image;
and carrying out image stretching on the color image to be processed to obtain a target color image.
In some embodiments, the chromaticity index includes a target stimulus value, chromaticity coordinates to be processed, or chromaticity point distance to be processed, the calculating dominant wavelength and purity from the target color image includes:
Calculating the target stimulus value according to the target color image, wherein the target stimulus value comprises a first stimulus value, a second stimulus value or a third stimulus value;
calculating the chromaticity coordinates to be processed according to the target stimulus value;
Calculating the distance and purity of the chromaticity point to be processed according to the chromaticity coordinate to be processed;
and calculating the dominant wavelength according to the distance between the chromaticity points to be processed.
In some embodiments, the calculating the chromaticity coordinates to be processed according to the target stimulus value includes:
calculating a first normalized value according to the target stimulus value;
calculating a second normalized value according to the target stimulus value;
And calculating the chromaticity coordinates to be processed according to the first normalized value and the second normalized value.
In some embodiments, the chromaticity coordinates to be processed include spectral locus chromaticity coordinates, iso-energy white light point chromaticity coordinates, or first chromaticity coordinates, and the calculating the chromaticity point distance to be processed according to the chromaticity coordinates to be processed includes:
Calculating a slope according to the chromaticity coordinates of the iso-energy white light point and the first chromaticity coordinates, wherein the calculation formula of the slope is as follows:
k = (yc - ys) / (xc - xs)
Wherein k is the slope, xc is the abscissa of the first chromaticity coordinate, yc is the ordinate of the first chromaticity coordinate, xs is the abscissa of the iso-energy white light point chromaticity coordinate, and ys is the ordinate of the iso-energy white light point chromaticity coordinate;
Calculating a to-be-processed chromaticity point distance according to the slope, the spectrum locus chromaticity coordinate, the iso-energy white light point chromaticity coordinate and the first chromaticity coordinate, wherein the calculation formula of the to-be-processed chromaticity point distance is as follows:
d = |k·xλ - yλ + yc - k·xc| / √(k² + 1)
Wherein d is the to-be-processed chromaticity point distance, k is the slope, xλ is the abscissa of the spectrum locus chromaticity coordinate, yλ is the ordinate of the spectrum locus chromaticity coordinate, xc is the abscissa of the first chromaticity coordinate, and yc is the ordinate of the first chromaticity coordinate.
In some embodiments, the to-be-processed chromaticity point distance includes a first chromaticity point distance or a second chromaticity point distance, and the calculating the dominant wavelength according to the to-be-processed chromaticity point distance includes:
acquiring a first wavelength corresponding to the first chromaticity point distance and a second wavelength corresponding to the second chromaticity point distance;
calculating a dominant wavelength according to the first chromaticity point distance, the first wavelength, the second chromaticity point distance and the second wavelength, wherein the calculation formula of the dominant wavelength is as follows:
λd = (λ1·d2 + λ2·d1) / (d1 + d2)
Where λd is the dominant wavelength, d1 is the first chromaticity point distance, λ1 is the first wavelength, d2 is the second chromaticity point distance, and λ2 is the second wavelength.
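This interpolation can be written directly. The inverse-distance weighting below is a reconstruction consistent with the variables defined above (the original formula image is not reproduced here); each candidate wavelength is weighted by the other point's distance, so the closer spectral-locus point dominates:

```python
def dominant_wavelength(d1, lam1, d2, lam2):
    # Inverse-distance interpolation between the two candidate
    # spectral-locus wavelengths; if d1 == 0 the result is exactly lam1.
    return (lam1 * d2 + lam2 * d1) / (d1 + d2)

# Equidistant candidates land midway between the two wavelengths.
midpoint = dominant_wavelength(1.0, 550.0, 1.0, 560.0)
```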
In some embodiments, the calculating the purity from the chromaticity coordinates to be processed includes:
And calculating a first distance according to the spectrum locus chromaticity coordinates and the iso-energy white light point chromaticity coordinates, wherein the calculation formula of the first distance is as follows:
SD = √((xλ - xs)² + (yλ - ys)²)
Wherein SD is the first distance, xλ is the abscissa of the spectrum locus chromaticity coordinate, yλ is the ordinate of the spectrum locus chromaticity coordinate, xs is the abscissa of the iso-energy white light point chromaticity coordinate, and ys is the ordinate of the iso-energy white light point chromaticity coordinate;
and calculating a second distance according to the first chromaticity coordinate and the iso-energy white light point chromaticity coordinate, wherein the calculation formula of the second distance is as follows:
SC = √((xc - xs)² + (yc - ys)²)
Wherein SC is the second distance, xc is the abscissa of the first chromaticity coordinate, yc is the ordinate of the first chromaticity coordinate, xs is the abscissa of the iso-energy white light point chromaticity coordinate, and ys is the ordinate of the iso-energy white light point chromaticity coordinate;
and calculating purity according to the first distance and the second distance, wherein the calculation formula of the purity is as follows:
P=SC/SD
Wherein P is the purity, SC is the second distance, and SD is the first distance.
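Assuming CIE-xy coordinates with S at the equal-energy point (1/3, 1/3), the purity follows directly from the two Euclidean distances:

```python
import math

def purity(xc, yc, xlam, ylam, xs=1/3, ys=1/3):
    # SD: equal-energy white point S to the spectral-locus point D
    SD = math.hypot(xlam - xs, ylam - ys)
    # SC: S to the chromaticity point C being evaluated
    SC = math.hypot(xc - xs, yc - ys)
    return SC / SD

# A point on the spectral locus itself has purity 1;
# the white point itself has purity 0.
```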
In another aspect, an embodiment of the present invention provides a multi-source satellite image floating plant identification system based on a chromaticity index, including:
The first module is used for acquiring a satellite remote sensing image to be processed, wherein the satellite remote sensing image to be processed comprises a synthetic aperture radar satellite remote sensing image or an optical satellite remote sensing image;
The second module is used for calculating a first vegetation graph based on a chromaticity index according to the satellite remote sensing image to be processed;
a third module, configured to obtain a preset time period;
A fourth module, configured to calculate a water template map according to the preset time period;
a fifth module for calculating a second vegetation map according to the first vegetation map and the water template map;
and a sixth module, configured to extract a target vegetation map from the second vegetation map.
The invention has the following beneficial effects:
According to the invention, the satellite remote sensing image to be processed is firstly obtained, the first vegetation image is calculated, then the preset time period is obtained, the water body template image is calculated, the second vegetation image is calculated according to the first vegetation image and the water body template image, and finally the target vegetation image is extracted from the second vegetation image, so that the identification of the floating plants is realized, the influence of the labor cost and the threshold value is reduced, and the identification accuracy and efficiency are improved.
Additional features and advantages of the invention will be set forth in the description which follows, and in part will be obvious from the description, or may be learned by practice of the invention. The objectives and other advantages of the invention may be realized and attained by the structure particularly pointed out in the written description and drawings.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings required for the description of the embodiments will be briefly described below, and it is apparent that the drawings in the following description are only some embodiments of the present application, and other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
FIG. 1 is a flowchart of a method for identifying a multi-source satellite image floating plant based on a chromaticity index according to an embodiment of the present invention;
FIG. 2 is a pseudo color image of a target after image stretching of a synthetic aperture radar satellite remote sensing image according to an embodiment of the present invention;
FIG. 3 is a CIE-xy two-dimensional chromaticity diagram illustrating an embodiment of the invention;
FIG. 4 is a graph showing the result of calculating the purity from the target pseudo-color image according to the embodiment of the present invention;
FIG. 5 is a block diagram of CIE-xy two-dimensional chromaticity coordinates according to an embodiment of the invention;
FIG. 6 is a graph showing the result of calculating dominant wavelength from a target pseudo-color image according to an embodiment of the present invention;
FIG. 7 is a flowchart of calculating a water template map according to a preset time period according to an embodiment of the present invention;
FIG. 8 is a water template diagram obtained by calculation according to a time sequence image set obtained in a preset time period according to an embodiment of the present invention;
FIG. 9 is a diagram of a target floating vegetation image extracted from a target pseudo-color image according to an embodiment of the present invention;
Fig. 10 is a view of a target floating vegetation map extracted from a target visual true color image according to an embodiment of the present invention.
Detailed Description
Embodiments of the present invention are described in detail below, examples of which are illustrated in the accompanying drawings, wherein like or similar reference numerals refer to like or similar elements or elements having like or similar functions throughout. The embodiments described below by referring to the drawings are illustrative only and are not to be construed as limiting the invention.
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention will be described in further detail with reference to the accompanying drawings and specific embodiments. The step numbers in the following embodiments are set for convenience of illustration only, and the order between the steps is not limited in any way, and the execution order of the steps in the embodiments may be adaptively adjusted according to the understanding of those skilled in the art.
In the description of the present invention, "several" means one or more, and "a plurality of" means two or more; "greater than", "less than", "exceeding", etc. are understood to exclude the stated number, while "above", "below", "within", etc. are understood to include it. The terms "first" and "second" are used only to distinguish technical features and should not be construed as indicating or implying relative importance, the number of the technical features indicated, or their precedence.
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs. The terminology used in the embodiments of the invention is for the purpose of describing embodiments of the invention only and is not intended to be limiting of the invention.
As shown in fig. 1, the embodiment of the invention provides a multi-source satellite image floating plant identification method based on a chromaticity index, which can be applied to a background processor, a server or cloud equipment corresponding to satellite remote sensing image floating plant identification software. During application, the method of the present embodiment includes, but is not limited to, the following steps:
Step S11, acquiring a satellite remote sensing image to be processed, wherein the satellite remote sensing image to be processed comprises a synthetic aperture radar satellite remote sensing image or an optical satellite remote sensing image.
In this embodiment, ground data may be collected by a synthetic aperture radar loaded on a satellite, to obtain a synthetic aperture radar satellite remote sensing image. The synthetic aperture radar is a high-resolution imaging radar, and can obtain a high-resolution radar image similar to an optical photograph under weather conditions with extremely low visibility. The ground data can be acquired through an optical acquisition device loaded on the satellite, so that an optical satellite remote sensing image can be obtained.
And step S12, calculating a first vegetation graph based on the chromaticity index according to the satellite remote sensing image to be processed.
In this embodiment, the specific implementation procedure of step S12 includes, but is not limited to, step S401 to step S403:
and S401, synthesizing a target color image according to the satellite remote sensing image to be processed.
In this embodiment, when the satellite remote sensing image to be processed is a synthetic aperture radar satellite remote sensing image, the execution process of step S401 is as follows:
And preprocessing the satellite remote sensing image to be processed to obtain a target satellite remote sensing image.
In this embodiment, the obtained synthetic aperture radar satellite remote sensing image is preprocessed, and orbit correction, thermal noise removal, radiometric calibration (conversion into a backscatter coefficient), speckle filtering (improved Lee filter), terrain correction and backscatter coefficient decibelization are sequentially performed, so as to obtain the target synthetic aperture radar satellite remote sensing image. The calculation formula of the backscattering coefficient decibelization is as follows:
σ0(dB) = 10 × log10(σ0)
Where σ0(dB) is the backscattering coefficient after decibelization, and σ0 is the backscattering coefficient.
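The decibelization step is a direct element-wise application of the formula above:

```python
import numpy as np

# Convert linear backscatter coefficients to decibels: σ0(dB) = 10·log10(σ0)
sigma0 = np.array([1.0, 0.1, 0.01])
sigma0_db = 10 * np.log10(sigma0)
```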
Synthesizing a color image to be processed according to the target satellite remote sensing image; wherein the color image to be processed may include, but is not limited to, a pseudo-color image, a visual true color image, or a standard pseudo-color image.
In this embodiment, according to the target synthetic aperture radar satellite remote sensing image, different polarized bands can be obtained through dual-polarization or full-polarization calculation, and pseudo-color images are then synthesized according to different band orders. The present embodiment employs dual polarization to calculate the σVV(dB), σVH(dB) and σVV-VH(dB) bands, wherein the calculation formula of the σVV-VH(dB) band is as follows:
σVV-VH(dB) = σVV(dB) - σVH(dB)
Where σVV-VH(dB) is the polarization band difference, σVV(dB) is the vertical-transmit vertical-receive single-polarization band, and σVH(dB) is the vertical-transmit horizontal-receive dual-polarization band.
In this embodiment, according to the calculated bands, the RGB channels are synthesized into a pseudo-color image in the band order σVV(dB), σVH(dB), σVV-VH(dB) as the color image to be processed.
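Stacking the bands in the stated channel order can be sketched as follows; random arrays stand in for the calibrated σVV(dB) and σVH(dB) rasters:

```python
import numpy as np

rng = np.random.default_rng(0)
sigma_vv_db = rng.uniform(-25.0, 0.0, size=(4, 4))   # stand-in VV band
sigma_vh_db = rng.uniform(-30.0, -5.0, size=(4, 4))  # stand-in VH band
sigma_diff_db = sigma_vv_db - sigma_vh_db            # σVV-VH(dB)

# R, G, B channels in the order σVV(dB), σVH(dB), σVV-VH(dB)
pseudo_color = np.dstack([sigma_vv_db, sigma_vh_db, sigma_diff_db])
```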
And performing image stretching on the color image to be processed to obtain a target color image.
In this embodiment, a clipping linear stretch is applied to the pseudo-color image synthesized from the synthetic aperture radar satellite remote sensing image: pixel values between the 5% and 95% cumulative-histogram percentiles are stretched, and an 8-bit target pseudo-color image with values in the range 0-255 is output; the stretching result is shown in fig. 2.
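A minimal implementation of this 5-95% clipping linear stretch:

```python
import numpy as np

def clip_linear_stretch(band, low_pct=5, high_pct=95):
    """Stretch the 5th-95th percentile range of a band to 0-255 (8-bit),
    clipping values that fall outside that range."""
    lo, hi = np.percentile(band, [low_pct, high_pct])
    scaled = np.clip((band - lo) / (hi - lo), 0.0, 1.0)
    return np.round(scaled * 255).astype(np.uint8)

stretched = clip_linear_stretch(np.linspace(-30.0, 0.0, 1000))
```

Applied per channel, this maps each band of the composite to the full 8-bit range regardless of the original backscatter or reflectance scale.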
In this embodiment, when the satellite remote sensing image to be processed is an optical satellite remote sensing image, the execution process of step S401 is as follows:
And preprocessing the satellite remote sensing image to be processed to obtain a target satellite remote sensing image.
In this embodiment, preprocessing is performed on the obtained optical satellite remote sensing image, and orthorectification, radiometric calibration, cloud removal processing and data fusion preprocessing are sequentially performed to obtain a target optical satellite remote sensing image.
And synthesizing a color image to be processed according to the target satellite remote sensing image.
In this embodiment, the color image to be processed includes a pseudo-color image, a visual true color image or a standard false-color image. The RR red band, RG green band, RB blue band, RNIR near-infrared band and RSWIR short-wave infrared band can be obtained by performing reflectivity analysis on the target optical satellite remote sensing image. According to the obtained bands, the RGB channels are synthesized into a visual true color image in the RSWIR, RNIR, RR band order, or into a standard false-color image in the RNIR, RR, RG band order.
And performing image stretching on the color image to be processed to obtain a target color image.
In this embodiment, a clipping linear stretch is applied to the visual true color image or standard false-color image synthesized from the optical satellite remote sensing image: pixel values between the 5% and 95% cumulative-histogram percentiles are stretched, and an 8-bit target visual true color image or target standard false-color image with values in the range 0-255 is output.
Step S402, calculating dominant wavelength and purity based on chromaticity index according to the target color image.
In this embodiment, the chromaticity index includes a target stimulus value, a chromaticity coordinate to be processed, or a chromaticity point distance to be processed, and the specific implementation process of step S402 includes, but is not limited to, steps S201-S204:
step S201, calculating a target stimulus value according to the target color image, wherein the target stimulus value comprises a first stimulus value, a second stimulus value or a third stimulus value.
In this embodiment, three primary colors of the target color image are acquired, and a first stimulus value, a second stimulus value, and a third stimulus value are calculated, respectively. The calculation formula of the target stimulus value is as follows:
X=2.7689R+1.7517G+1.1302B
Y=1.0000R+4.5907G+0.0601B
Z=0.0000R+0.0565G+5.5934B
wherein X is a first stimulus value, Y is a second stimulus value, Z is a third stimulus value, R is red in the three primary colors, G is green in the three primary colors, and B is blue in the three primary colors.
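Using the coefficients given above (the CIE 1931 RGB-to-XYZ transform), the stimulus values can be computed per pixel:

```python
def rgb_to_xyz(R, G, B):
    # CIE 1931 RGB -> XYZ transformation with the coefficients
    # stated in the text.
    X = 2.7689 * R + 1.7517 * G + 1.1302 * B
    Y = 1.0000 * R + 4.5907 * G + 0.0601 * B
    Z = 0.0000 * R + 0.0565 * G + 5.5934 * B
    return X, Y, Z
```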
Step S202, calculating chromaticity coordinates to be processed according to the target stimulus value.
In this embodiment, the first normalized value and the second normalized value are calculated from the first stimulus value, the second stimulus value and the third stimulus value. The calculation formulas of the normalized values are as follows:
x = X / (X + Y + Z)
y = Y / (X + Y + Z)
z = Z / (X + Y + Z)
x + y + z = 1
Wherein x is the first normalized value, y is the second normalized value, z is the third normalized value, X is the first stimulus value, Y is the second stimulus value, and Z is the third stimulus value.
In this embodiment, (x, y) is taken as the chromaticity coordinates to be processed according to the first normalized value and the second normalized value, and the third normalized value is used to limit the chromaticity coordinates to be processed. In this embodiment, the chromaticity point required in the target color image is calculated as chromaticity coordinates to be processed to form a chromaticity diagram, as shown in fig. 3, the calculated chromaticity coordinates to be processed are all located in a CIE-xy two-dimensional chromaticity diagram, the CIE-xy two-dimensional chromaticity diagram is used for representing multiple colors in the visible light range, each color corresponds to one chromaticity coordinate (x, y), and each chromaticity coordinate falls in a range surrounded by a horseshoe-shaped spectrum locus.
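The normalization above reduces to a single division per coordinate; since x + y + z = 1, only (x, y) need be kept:

```python
def chromaticity_coordinates(X, Y, Z):
    # Normalize the stimulus values; z = 1 - x - y is implied,
    # so (x, y) fully determines the chromaticity point.
    total = X + Y + Z
    return X / total, Y / total

x, y = chromaticity_coordinates(2.0, 1.0, 1.0)
```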
And step S203, calculating the distance and purity of the chromaticity point to be processed according to the chromaticity coordinates to be processed.
In this embodiment, the chromaticity coordinates to be processed include the chromaticity coordinates of the spectrum locus, the chromaticity coordinates of the iso-energy white light point, or the first chromaticity coordinates. In this embodiment, calculating the distance between the chromaticity points to be processed according to the chromaticity coordinates to be processed includes:
and calculating the slope according to the iso-energy white light point chromaticity coordinates and the first chromaticity coordinates. Wherein, the calculation formula of the slope is as follows:
k = (yc - ys) / (xc - xs)
Where k is the slope, xc is the abscissa of the first chromaticity coordinate, yc is the ordinate of the first chromaticity coordinate, xs is the abscissa of the iso-energy white light point chromaticity coordinate, and ys is the ordinate of the iso-energy white light point chromaticity coordinate.
In this embodiment, the S point in fig. 3 is taken as the iso-energy white light point and its coordinates as the iso-energy white light point chromaticity coordinates; its coordinate value is (1/3, 1/3), indicating that the three primary colors are mixed in equal proportions. Meanwhile, any point within the spectrum locus range may be used as the first chromaticity point; in this embodiment, the C point in fig. 3 is used as the first chromaticity point, and its coordinate may be denoted (xc, yc). Based on the calculation formula of the slope, the slope of the straight line connecting the S point and the C point can be obtained.
And calculating the to-be-processed chromaticity point distance according to the slope, the spectrum locus chromaticity coordinates, the iso-energy white light point chromaticity coordinates and the first chromaticity coordinates. The calculation formula of the to-be-processed chromaticity point distance is as follows:
d = |k·xλ - yλ + yc - k·xc| / √(k² + 1)
Where d is the to-be-processed chromaticity point distance, k is the slope, xλ is the abscissa of the spectrum locus chromaticity coordinate, yλ is the ordinate of the spectrum locus chromaticity coordinate, xc is the abscissa of the first chromaticity coordinate, and yc is the ordinate of the first chromaticity coordinate.
In this embodiment, any point on the spectrum locus may be selected as a spectrum locus chromaticity point, and its coordinate is the spectrum locus chromaticity coordinate, and its coordinate value is (x λ,yλ). It will be appreciated that the first stimulus value, the second stimulus value and the third stimulus value of the spectral locus in the CIE-XYZ system are all known, and therefore the chromaticity coordinates (x λ,yλ) of any point on the spectral locus are also known. Based on the calculation formula of the distance between the chromaticity points to be processed, the distance between any point (x λ,yλ) on the spectrum locus and a straight line formed by connecting the S point and the C point can be calculated. For example, the D point on the spectrum locus may be used as a spectrum locus chromaticity point, the coordinates thereof may be used as spectrum locus chromaticity coordinates, and the distance from the D point to the straight line SC may be calculated by the above calculation formula of the chromaticity point distance to be processed.
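As a minimal Python sketch of the slope and point-to-line distance formulas above (the coordinate values chosen for the C point and the locus point are hypothetical, not taken from the embodiment):

```python
import math

def slope(white, c):
    """Slope k of the straight line through the iso-energy white point S and
    the first chromaticity point C; both arguments are (x, y) pairs."""
    (xs, ys), (xc, yc) = white, c
    return (yc - ys) / (xc - xs)

def point_line_distance(p, white, k):
    """Perpendicular distance from a spectrum-locus point p = (x_lambda, y_lambda)
    to the line of slope k passing through the white point S."""
    (xl, yl), (xs, ys) = p, white
    return abs(k * (xl - xs) - (yl - ys)) / math.sqrt(k * k + 1)

S = (1 / 3, 1 / 3)   # iso-energy white point
C = (0.30, 0.45)     # hypothetical first chromaticity point
k = slope(S, C)
d = point_line_distance((0.20, 0.70), S, k)  # hypothetical locus point
```

A point lying on the line SC (for instance C itself) gives a distance of zero, which is what step S204 later exploits to locate the intersection with the spectrum locus.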
In this embodiment, calculating the purity from the chromaticity coordinates to be processed includes:
according to the chromaticity coordinates of the spectrum locus, the chromaticity coordinates of the iso-energy white light point and the first chromaticity coordinates, a first distance and a second distance are calculated respectively, and the calculation formulas of the first distance and the second distance are as follows:
SD = √((xλ - xs)² + (yλ - ys)²)
Wherein SD is the first distance, xλ is the abscissa of the chromaticity coordinates of the spectrum locus, yλ is the ordinate of the chromaticity coordinates of the spectrum locus, xs is the abscissa of the chromaticity coordinates of the iso-energy white light point, and ys is the ordinate of the chromaticity coordinates of the iso-energy white light point;
SC = √((xc - xs)² + (yc - ys)²)
Where SC is the second distance, xc is the abscissa of the first chromaticity coordinate, yc is the ordinate of the first chromaticity coordinate, xs is the abscissa of the chromaticity coordinate of the iso-energy white light point, and ys is the ordinate of the chromaticity coordinate of the iso-energy white light point;
according to the first distance and the second distance, the purity is calculated, and the calculation formula of the purity is as follows:
P=SC/SD
Wherein P is the purity, SC is the second distance, and SD is the first distance.
In this embodiment, the distance from the S point to the D point in fig. 3 may be calculated by a calculation formula of the first distance, and the distance from the S point to the C point in fig. 3 may be calculated by a calculation formula of the second distance. Based on a calculation formula of purity, the purity can be calculated by comparing the distance from the S point to the C point with the distance from the S point to the D point, and the calculation result is shown in fig. 4.
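The purity computation of this step can be sketched in Python as follows (the sample coordinates for C and D are hypothetical placeholders, not values from fig. 3):

```python
import math

def dist(p, q):
    """Euclidean distance between two chromaticity points (x, y)."""
    return math.hypot(p[0] - q[0], p[1] - q[1])

S = (1 / 3, 1 / 3)   # iso-energy white point
C = (0.31, 0.40)     # hypothetical sample chromaticity point
D = (0.28, 0.54)     # hypothetical locus point roughly on the ray S -> C

SC = dist(S, C)      # second distance
SD = dist(S, D)      # first distance
P = SC / SD          # purity: fraction of the way from white toward the locus
```

Because C lies between S and the spectrum locus, the purity P falls between 0 (pure white) and 1 (a fully saturated spectral color).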
Step S204, calculating the dominant wavelength according to the distance between the chromaticity points to be processed.
In this embodiment, the distance between the to-be-processed chromaticity points includes a first chromaticity point distance or a second chromaticity point distance, and calculating the dominant wavelength according to the to-be-processed chromaticity point distance includes:
Obtaining a first wavelength corresponding to the first chromaticity point distance and a second wavelength corresponding to the second chromaticity point distance, and calculating the dominant wavelength according to the first chromaticity point distance, the first wavelength, the second chromaticity point distance and the second wavelength, wherein the calculation formula of the dominant wavelength is as follows:
λd = (d2·λ1 + d1·λ2) / (d1 + d2)
Where λd is the dominant wavelength, d1 is the first chromaticity point distance, λ1 is the first wavelength, d2 is the second chromaticity point distance, and λ2 is the second wavelength.
In this embodiment, the entire chromaticity space may be divided into four regions according to the CIE-xy chromaticity diagram; the region division result is shown in fig. 5. It will be appreciated that the intersections of a line with the spectrum locus are the locus points whose to-be-processed chromaticity point distance is zero. In region I or region III, the straight line connecting point E with any chromaticity point intersects the spectrum locus at two points, so each chromaticity point has both a dominant wavelength and a complementary wavelength. When the chromaticity point is in region I, the intersection with the spectrum locus in region I is its dominant wavelength and the intersection in region III is its complementary wavelength; when the chromaticity point is in region III, the opposite holds. In region II or region IV, the straight line connecting point E with any chromaticity point has only one intersection with the spectrum locus. When the chromaticity point is in region II, the intersection of this line with the spectrum locus in region II is its dominant wavelength; when the chromaticity point is in region IV, the intersection is its complementary wavelength.
Illustratively, in fig. 3, the wavelength corresponding to the intersection point D of the spectrum locus and the ray directed from the S point through the C point is the dominant wavelength of the C point color, and the wavelength corresponding to the intersection point M of the spectrum locus and the ray directed from the C point through the S point is the complementary wavelength of the C point color. It will be appreciated that all points on a ray directed from the iso-energy white light point to the spectrum locus share the same dominant or complementary wavelength. The dominant wavelength is an important indicator of color quantification: it identifies visible light as different colors over the range 380 nm to 700 nm at 1 nm intervals and can represent the hue of a color as a specific wavelength. When the to-be-processed chromaticity point distance is zero, the dominant wavelength or complementary wavelength of the chromaticity point can be determined with an accuracy of 1 nm.
In this embodiment, the chromaticity coordinate values of the spectrum locus provided by the CIE-xy chromaticity diagram are sampled at 1 nm intervals, so when the to-be-processed chromaticity point distance cannot be made exactly zero, the dominant wavelength of the chromaticity point can be obtained by the above interpolation formula. In fig. 5, when the intersection point H does not fall on a 1 nm sample of the locus, the two sampled locus points closest to the point H are selected as auxiliary calculation points, namely the point J and the point K. The distance from the point J to the straight line EG is the first chromaticity point distance and the wavelength corresponding to the point J is the first wavelength; the distance from the point K to the straight line EG is the second chromaticity point distance and the wavelength corresponding to the point K is the second wavelength. The dominant wavelength of the chromaticity point G can then be calculated by the above dominant wavelength calculation formula, and the calculation result is shown in fig. 6.
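The inverse-distance interpolation between the two nearest 1 nm locus samples can be sketched in Python as follows (the distance and wavelength values are hypothetical):

```python
def dominant_wavelength(d1, lam1, d2, lam2):
    """Interpolate the dominant wavelength between the two nearest 1 nm
    spectrum-locus samples. d1, d2 are the first/second chromaticity point
    distances (point J / point K to the line), lam1, lam2 the corresponding
    wavelengths in nm. A sample exactly on the line (distance 0) dominates."""
    return (d2 * lam1 + d1 * lam2) / (d1 + d2)

# Hypothetical values: J is closer to the line than K, so the result
# lands nearer to lam1.
lam = dominant_wavelength(0.002, 559.0, 0.006, 560.0)
```

The weighting is inverse to the distance: the sample whose distance to the line is smaller receives the larger weight, and a sample with distance zero returns its own wavelength exactly.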
Step S403, extracting a first vegetation graph from the target color image according to the dominant wavelength and purity.
In this embodiment, according to the dominant wavelength and the purity, and according to the colors of vegetation in the target pseudo-color image, the target apparent-color image or the target standard pseudo-color image, the pixel points with dominant wavelength greater than 550 nm are screened from the target color image and their pixel values are marked as 1; pixel points that do not meet the condition are marked as 0, and the resulting binary vegetation map is extracted as the first vegetation map.
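The screening rule of step S403 can be sketched as a simple per-pixel threshold (the raster is represented here as a plain 2-D list of dominant wavelengths, an illustrative layout rather than the patent's data format):

```python
def first_vegetation_map(dominant_wl):
    """Binarize a dominant-wavelength raster: pixels with dominant wavelength
    greater than 550 nm are marked 1 (floating vegetation), all others 0.
    dominant_wl is a 2-D list of per-pixel dominant wavelengths in nm."""
    return [[1 if wl > 550 else 0 for wl in row] for row in dominant_wl]

# Hypothetical 2x2 raster of dominant wavelengths.
veg = first_vegetation_map([[560.2, 489.0], [551.0, 540.5]])
```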
Step S13, acquiring a preset time period.
In this embodiment, the preset time period is a period in which the imaging times of the target satellite remote sensing images are close to one another, and the images observed by satellite revisits within the preset time period need to include at least 5 phases. An optical remote sensing image is counted as one phase only when its water surface is free of cloud and cloud-shadow coverage. For example, the preset time period may be set to cover 5 phases.
And S14, calculating a water body template chart according to a preset time period.
In this embodiment, as shown in fig. 7, the implementation procedure of step S14 may include, but is not limited to, step S301 to step S305:
step S301, acquiring a time sequence image set according to a preset time period.
In this embodiment, the time series image set may include a plurality of time-phase to-be-processed satellite remote sensing images. A plurality of to-be-processed satellite remote sensing images close to the acquisition time point of the first vegetation map, covering the same area range as the first vegetation map, may be acquired within the preset time period as the time series image set. It can be understood that in step S12 the first vegetation map is calculated from the to-be-processed satellite remote sensing image, so the acquisition time point of the first vegetation map is the acquisition time point of the to-be-processed satellite remote sensing image in step S12. For example, 5 to-be-processed satellite remote sensing images close to the acquisition time point of the first vegetation map may be acquired as the time series image set.
Step S302, calculating a time sequence water body atlas according to the time sequence image set.
In this embodiment, the time sequence water body atlas may include a plurality of third vegetation maps. For example, each to-be-processed satellite remote sensing image in the time series image set may be taken as the to-be-processed satellite remote sensing image of step S12, and the map obtained by executing step S12 is taken as a third vegetation map; calculating one third vegetation map for each time-phase image in the time series image set yields the time sequence water body atlas. Whereas in step S12 the first vegetation map is extracted from the target color image according to the dominant wavelength and purity, in this embodiment, according to the colors of vegetation in the target pseudo-color image, the target apparent-color image or the target standard pseudo-color image, the pixel points with dominant wavelength less than 500 nm and purity less than 0.35 are screened from the target color image and their pixel values are marked as 1; pixel points that do not meet the condition are marked as 0, and the resulting binary map is extracted as the third vegetation map.
And step S303, calculating a water body pixel frequency chart according to the time sequence water body chart set.
In this embodiment, a plurality of third vegetation maps are obtained from the time sequence water body atlas, and the pixel values of the pixel points at the same position in each third vegetation map are summed to obtain the water body pixel frequency map. Illustratively, if, among the pixel points at the same position in the 5 third vegetation maps, 3 have a pixel value of 1 and 2 have a pixel value of 0, the pixel value at that position in the water body pixel frequency map is 3.
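The per-position summation of step S303 can be sketched as follows (binary maps are represented as 2-D lists; the single-pixel example data are hypothetical):

```python
def water_pixel_frequency(maps):
    """Sum the pixel values of several binary third vegetation (water) maps at
    each position, giving a per-pixel occurrence count."""
    rows, cols = len(maps[0]), len(maps[0][0])
    return [[sum(m[i][j] for m in maps) for j in range(cols)]
            for i in range(rows)]

# Five hypothetical phases of a single-pixel map: water in 3 of 5 phases.
maps = [[[1]], [[1]], [[1]], [[0]], [[0]]]
freq = water_pixel_frequency(maps)
```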
And step S304, calculating a water body frequency chart according to the water body pixel frequency chart.
In this embodiment, in the water body pixel frequency map, the pixel value of each pixel point is divided by M to obtain the water body frequency map, where M is the number of third vegetation maps. Illustratively, when the water body pixel frequency map is calculated from 5 third vegetation maps, the pixel value of each pixel point in the water body pixel frequency map is divided by 5 to obtain the water body frequency map; the pixel values in the water body frequency map may therefore be decimals.
And step S305, calculating a water body template diagram according to the water body frequency diagram.
In this embodiment, the pixel value of each pixel point in the water body frequency map is judged; if it meets a preset condition, the pixel value of the pixel point is marked as 1, otherwise it is marked as 0. For example, the pixel value of each pixel point in the water body frequency map may be judged against the preset threshold of 0.5: if the pixel value is greater than 0.5, the pixel point is marked as 1; if the pixel value is less than or equal to 0.5, it is marked as 0. Judging every pixel point of the water body frequency map in this way yields the water body template map; the calculation result is shown in fig. 8. Through the water body template map, this embodiment obtains the water body position points that occur at high frequency within the preset time period, effectively reducing the interference of low-frequency water body position points.
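Steps S304 and S305 together amount to a divide-then-threshold pass over the count map, which can be sketched as (the count values and the map size are hypothetical):

```python
def water_template(count_map, n_maps, threshold=0.5):
    """Divide each occurrence count by the number of maps M to get the water
    frequency, then mark 1 where the frequency exceeds the threshold, else 0."""
    return [[1 if c / n_maps > threshold else 0 for c in row]
            for row in count_map]

# Hypothetical 2x2 count map from 5 phases: frequencies 0.6, 0.4, 1.0, 0.0.
template = water_template([[3, 2], [5, 0]], n_maps=5)
```

Only positions flooded in more than half of the phases survive, which is exactly the high-frequency water mask the embodiment describes.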
And S15, calculating a second vegetation map according to the first vegetation map and the water body template map.
In this embodiment, according to the first vegetation map and the water body template map, the pixel values of their pixel points are intersected to obtain the second vegetation map. For example, if the pixel values at the same position are not both 1 (i.e., they differ or are both 0), the pixel value of that point is marked as 0; if the pixel values at the same position are both 1, the pixel value is marked as 1, yielding the second vegetation map. Through this intersection of the first vegetation map and the water body template map, the pixels of terrestrial vegetation in the first vegetation map can be removed.
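The intersection of step S15 is a per-pixel logical AND, sketched here on hypothetical 2x2 binary maps:

```python
def intersect(veg_map, template):
    """Pixel-wise AND of the first vegetation map and the water template map:
    a pixel is 1 only where both maps are 1, which removes terrestrial
    vegetation pixels lying outside the high-frequency water mask."""
    return [[a & b for a, b in zip(r1, r2)]
            for r1, r2 in zip(veg_map, template)]

second = intersect([[1, 1], [0, 1]], [[1, 0], [1, 1]])
```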
And S16, extracting a target vegetation map from the second vegetation map.
In this embodiment, the SIEVE filtering tool of the PCI image processing software is used to perform small-patch filtering: the number of like-valued pixels around each pixel in the second vegetation map is counted, and patches smaller than a preset threshold are removed, so that isolated vegetation pixels are eliminated and the target vegetation map is obtained after filtering. By way of example, pixels whose number of surrounding like-valued pixels is less than 3 are marked as 0, removing isolated vegetation pixels and yielding the target vegetation map. In this embodiment, when the second vegetation map is calculated from the target pseudo-color image, the result of extracting the target vegetation map from the second vegetation map is shown in fig. 9; when the second vegetation map is calculated from the target apparent-color image, the result is shown in fig. 10. Removing isolated vegetation pixels with the filtering tool eliminates erroneously identified isolated vegetation position points and improves the identification accuracy of the target vegetation map.
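A simplified stand-in for the SIEVE tool (not PCI's actual implementation) can be sketched as a connected-component filter: 4-connected patches of vegetation pixels smaller than a minimum size are zeroed out. The map and the size threshold below are hypothetical.

```python
from collections import deque

def sieve(binary, min_size=3):
    """Zero out 4-connected components of 1-pixels smaller than min_size,
    removing isolated vegetation pixels from a binary map."""
    rows, cols = len(binary), len(binary[0])
    out = [row[:] for row in binary]
    seen = [[False] * cols for _ in range(rows)]
    for i in range(rows):
        for j in range(cols):
            if out[i][j] == 1 and not seen[i][j]:
                # Flood-fill one component, collecting its pixels.
                comp, q = [], deque([(i, j)])
                seen[i][j] = True
                while q:
                    a, b = q.popleft()
                    comp.append((a, b))
                    for da, db in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        na, nb = a + da, b + db
                        if (0 <= na < rows and 0 <= nb < cols
                                and out[na][nb] == 1 and not seen[na][nb]):
                            seen[na][nb] = True
                            q.append((na, nb))
                if len(comp) < min_size:
                    for a, b in comp:
                        out[a][b] = 0
    return out

# Hypothetical 3x3 map: a lone pixel is removed, a 3-pixel strip survives.
cleaned = sieve([[1, 0, 0], [0, 0, 0], [1, 1, 1]])
```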
The embodiment of the invention has the following beneficial effects: the method first acquires a to-be-processed satellite remote sensing image, calculates a dominant wavelength map and a purity map from it, and extracts a first vegetation map. It then acquires a preset time period, obtains a time series image set for that period, calculates the dominant wavelength map and purity map of each time-phase satellite remote sensing image in the set, and extracts third vegetation maps to form a time sequence water body atlas, from which a water body template map is calculated. A second vegetation map is obtained by intersecting the first vegetation map with the water body template map, and finally a target vegetation map is extracted from the second vegetation map. This realizes the identification of floating plants, reduces labor cost and the influence of threshold selection, and improves identification accuracy and efficiency.
According to the method, a time series image set is selected to form the dominant wavelength map and purity map of each phase; a binarized water map of each phase image is extracted under the condition that the dominant wavelength is less than 500 nm and the purity is less than 0.35; the water body pixel frequency map is then calculated by summation; the water body frequency map is calculated based on the number of third vegetation maps in the time sequence water body atlas; and the water body template map is generated by judging whether each pixel value in the water body frequency map is greater than 0.5. Intersecting the water body template map with the first vegetation map and extracting the floating vegetation map as the target vegetation map effectively eliminates the influence of terrestrial vegetation, improves the distinction between terrestrial and floating plants, and improves the identification precision of floating plants. Meanwhile, the method synthesizes band information into a pseudo-color image, sets thresholds on the chromaticity indexes, and extracts the water body and the floating plants; it has the advantage of crossing sensors and is applicable to data of different observation platforms (satellite and unmanned aerial vehicle), different types and different resolutions, including synthetic aperture radar images with two or more polarization modes, multispectral optical images, and RGB natural images. Because the water body and the floating plants are extracted from the dominant wavelength and purity chromaticity indexes of the image, with the thresholds determined from the image color of the target ground object, the method has clear physical meaning and strong robustness; the optical satellite imagery tolerates atmospheric influence, and no atmospheric correction is needed.
The embodiment of the invention also provides a multisource satellite image floating plant identification system based on the chromaticity index, which comprises the following steps:
The first module is used for acquiring a satellite remote sensing image to be processed, wherein the satellite remote sensing image to be processed comprises a synthetic aperture radar satellite remote sensing image or an optical satellite remote sensing image;
The second module is used for calculating a first vegetation graph based on the chromaticity index according to the satellite remote sensing image to be processed;
a third module, configured to obtain a preset time period;
A fourth module, configured to calculate a water template map according to a preset time period;
the fifth module is used for calculating a second vegetation map according to the first vegetation map and the water body template map;
And a sixth module for extracting a target vegetation map from the second vegetation map.
The content in the method embodiment is applicable to the system embodiment, the functions specifically realized by the system embodiment are the same as those of the method embodiment, and the achieved beneficial effects are the same as those of the method embodiment.
While the preferred embodiment of the present invention has been described in detail, the present invention is not limited to the above embodiment, and various equivalent modifications and substitutions can be made by those skilled in the art without departing from the spirit of the present invention, and these equivalent modifications and substitutions are intended to be included in the scope of the present invention as defined in the appended claims.

Claims (6)

1. The multisource satellite image floating plant identification method based on the chromaticity index is characterized by comprising the following steps of:
Acquiring a satellite remote sensing image to be processed, wherein the satellite remote sensing image to be processed comprises a synthetic aperture radar satellite remote sensing image or an optical satellite remote sensing image;
calculating a first vegetation map based on a chromaticity index according to the satellite remote sensing image to be processed;
acquiring a preset time period;
Calculating a water body template diagram according to the preset time period;
calculating a second vegetation map according to the first vegetation map and the water body template map;
extracting a target vegetation map from the second vegetation map;
the calculating a first vegetation graph based on the chromaticity index according to the satellite remote sensing image to be processed comprises:
Synthesizing a target color image according to the satellite remote sensing image to be processed;
calculating dominant wavelength and purity based on chromaticity index according to the target color image;
extracting the first vegetation map from the target color image according to the dominant wavelength and the purity;
the chromaticity index includes a target stimulus value, a chromaticity coordinate to be processed, or a chromaticity point distance to be processed, and the calculating the dominant wavelength and purity according to the target color image includes:
Calculating the target stimulus value according to the target color image, wherein the target stimulus value comprises a first stimulus value, a second stimulus value or a third stimulus value;
calculating the chromaticity coordinates to be processed according to the target stimulus value;
Calculating the distance and purity of the chromaticity point to be processed according to the chromaticity coordinate to be processed;
calculating dominant wavelength according to the distance between the chromaticity points to be processed;
The chromaticity coordinates to be processed comprise spectrum locus chromaticity coordinates, iso-energy white light point chromaticity coordinates or first chromaticity coordinates, and the calculating the chromaticity point distance to be processed according to the chromaticity coordinates to be processed comprises the following steps:
Calculating a slope according to the chromaticity coordinates of the iso-energy white light point and the first chromaticity coordinates, wherein the calculation formula of the slope is as follows:
k = (yc - ys) / (xc - xs)
Wherein k is the slope, xc is the abscissa of the first chromaticity coordinate, yc is the ordinate of the first chromaticity coordinate, xs is the abscissa of the chromaticity coordinate of the iso-energy white light point, and ys is the ordinate of the chromaticity coordinate of the iso-energy white light point;
Calculating a to-be-processed chromaticity point distance according to the slope, the spectrum locus chromaticity coordinate, the iso-energy white light point chromaticity coordinate and the first chromaticity coordinate, wherein the calculation formula of the to-be-processed chromaticity point distance is as follows:
d = |k(xλ - xs) - (yλ - ys)| / √(k² + 1)
Wherein d is the to-be-processed chromaticity point distance, k is the slope, xλ and yλ are the abscissa and ordinate of the spectrum locus chromaticity coordinate, and xs and ys are the abscissa and ordinate of the iso-energy white light point chromaticity coordinate;
the calculating the purity according to the chromaticity coordinates to be processed comprises the following steps:
And calculating a first distance according to the chromaticity coordinates of the spectrum locus and the chromaticity coordinates of the iso-energy white light point, wherein the calculation formula of the first distance is as follows:
SD = √((xλ - xs)² + (yλ - ys)²)
Wherein SD is the first distance, xλ is the abscissa of the chromaticity coordinate of the spectrum locus, yλ is the ordinate of the chromaticity coordinate of the spectrum locus, xs is the abscissa of the chromaticity coordinate of the iso-energy white light point, and ys is the ordinate of the chromaticity coordinate of the iso-energy white light point;
and calculating a second distance according to the first chromaticity coordinate and the chromaticity coordinate of the iso-energy white light point, wherein the calculation formula of the second distance is as follows:
SC = √((xc - xs)² + (yc - ys)²)
Wherein SC is the second distance, xc is the abscissa of the first chromaticity coordinate, yc is the ordinate of the first chromaticity coordinate, xs is the abscissa of the chromaticity coordinate of the iso-energy white light point, and ys is the ordinate of the chromaticity coordinate of the iso-energy white light point;
and calculating purity according to the first distance and the second distance, wherein the calculation formula of the purity is as follows:
P=SC/SD
Wherein P is the purity, SC is the second distance, and SD is the first distance.
2. The method for identifying a multi-source satellite image floating plant based on a chromaticity index as claimed in claim 1, wherein the calculating a water body template map according to the preset time period comprises:
acquiring a time sequence image set according to the preset time period, wherein the time sequence image set comprises a plurality of time-phase to-be-processed satellite remote sensing images;
Calculating a time sequence water body atlas according to the time sequence image set, wherein the time sequence water body atlas comprises a plurality of third vegetation figures;
calculating a water body pixel frequency chart according to the time sequence water body chart set;
Calculating a water body frequency chart according to the water body pixel frequency chart;
and calculating the water body template diagram according to the water body frequency diagram.
3. The method for identifying a multi-source satellite image floating plant based on a chromaticity index as claimed in claim 1, wherein said synthesizing a target color image from said satellite remote sensing image to be processed comprises:
preprocessing the satellite remote sensing image to be processed to obtain a target satellite remote sensing image;
synthesizing a color image to be processed according to the target satellite remote sensing image, wherein the color image to be processed comprises a pseudo-color image, a visual true color image or a standard pseudo-color image;
and carrying out image stretching on the color image to be processed to obtain a target color image.
4. The method for identifying a multi-source satellite image floating plant based on a chromaticity index as claimed in claim 1, wherein the calculating the chromaticity coordinates to be processed according to the target stimulus value comprises:
calculating a first normalized value according to the target stimulus value;
calculating a second normalized value according to the target stimulus value;
And calculating the chromaticity coordinates to be processed according to the first normalized value and the second normalized value.
5. The method for identifying a multi-source satellite image floating plant based on a chromaticity index as claimed in claim 1, wherein the chromaticity point distance to be processed includes a first chromaticity point distance or a second chromaticity point distance, and the calculating the dominant wavelength according to the chromaticity point distance to be processed includes:
acquiring a first wavelength corresponding to the first chromaticity point distance and a second wavelength corresponding to the second chromaticity point distance;
calculating a dominant wavelength according to the first chromaticity point distance, the first wavelength, the second chromaticity point distance and the second wavelength, wherein the calculation formula of the dominant wavelength is as follows:
λd = (d2·λ1 + d1·λ2) / (d1 + d2)
Where λd is the dominant wavelength, d1 is the first chromaticity point distance, λ1 is the first wavelength, d2 is the second chromaticity point distance, and λ2 is the second wavelength.
6. Multisource satellite image floating plant identification system based on chromaticity index, which is characterized by comprising:
The first module is used for acquiring a satellite remote sensing image to be processed, wherein the satellite remote sensing image to be processed comprises a synthetic aperture radar satellite remote sensing image or an optical satellite remote sensing image;
The second module is used for calculating a first vegetation graph based on a chromaticity index according to the satellite remote sensing image to be processed;
a third module, configured to obtain a preset time period;
A fourth module, configured to calculate a water template map according to the preset time period;
a fifth module for calculating a second vegetation map according to the first vegetation map and the water template map;
A sixth module for extracting a target vegetation map from the second vegetation map;
the calculating a first vegetation graph based on the chromaticity index according to the satellite remote sensing image to be processed comprises:
Synthesizing a target color image according to the satellite remote sensing image to be processed;
calculating dominant wavelength and purity based on chromaticity index according to the target color image;
extracting the first vegetation map from the target color image according to the dominant wavelength and the purity;
the chromaticity index includes a target stimulus value, a chromaticity coordinate to be processed, or a chromaticity point distance to be processed, and the calculating the dominant wavelength and purity according to the target color image includes:
Calculating the target stimulus value according to the target color image, wherein the target stimulus value comprises a first stimulus value, a second stimulus value or a third stimulus value;
calculating the chromaticity coordinates to be processed according to the target stimulus value;
Calculating the distance and purity of the chromaticity point to be processed according to the chromaticity coordinate to be processed;
calculating dominant wavelength according to the distance between the chromaticity points to be processed;
The chromaticity coordinates to be processed comprise spectrum locus chromaticity coordinates, iso-energy white light point chromaticity coordinates or first chromaticity coordinates, and the calculating the chromaticity point distance to be processed according to the chromaticity coordinates to be processed comprises the following steps:
Calculating a slope according to the chromaticity coordinates of the iso-energy white light point and the first chromaticity coordinates, wherein the calculation formula of the slope is as follows:
k = (yc - ys) / (xc - xs)
Wherein k is the slope, xc is the abscissa of the first chromaticity coordinate, yc is the ordinate of the first chromaticity coordinate, xs is the abscissa of the chromaticity coordinate of the iso-energy white light point, and ys is the ordinate of the chromaticity coordinate of the iso-energy white light point;
calculating the to-be-processed chromaticity point distance according to the slope, the spectrum locus chromaticity coordinate, the equal-energy white point chromaticity coordinate, and the first chromaticity coordinate, wherein the calculation formula of the to-be-processed chromaticity point distance is as follows:
d = |k·x λ - y λ + y s - k·x s | / √(k² + 1)
wherein d is the to-be-processed chromaticity point distance, namely the perpendicular distance from a spectrum locus point to the line through the equal-energy white point with slope k, k is the slope, x λ is the abscissa of the spectrum locus chromaticity coordinate, y λ is the ordinate of the spectrum locus chromaticity coordinate, x s is the abscissa of the equal-energy white point chromaticity coordinate, and y s is the ordinate of the equal-energy white point chromaticity coordinate;
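The slope and distance steps combine into the dominant-wavelength search: the spectrum locus point closest to the line drawn from the white point through the pixel chromaticity marks the dominant wavelength. A sketch under the assumption that the to-be-processed chromaticity point distance is the perpendicular point-to-line distance; the three locus samples and the pixel chromaticity are illustrative stand-ins, not a full CIE locus table:

```python
import math

def line_slope(xc, yc, xs, ys):
    # slope of the line through the equal-energy white point (xs, ys)
    # and the pixel's first chromaticity coordinate (xc, yc)
    return (yc - ys) / (xc - xs)

def point_line_distance(k, x_lam, y_lam, xs, ys):
    # perpendicular distance from the spectrum locus point (x_lam, y_lam)
    # to the line y - ys = k * (x - xs)
    return abs(k * x_lam - y_lam + ys - k * xs) / math.sqrt(k * k + 1.0)

# illustrative spectrum locus samples: (wavelength in nm, x, y)
locus = [(550, 0.3016, 0.6923), (560, 0.3731, 0.6245), (570, 0.4441, 0.5547)]
xs, ys = 1 / 3, 1 / 3      # equal-energy white point E
xc, yc = 0.38, 0.48        # example pixel chromaticity (greenish, vegetation-like)

k = line_slope(xc, yc, xs, ys)
# dominant wavelength = locus sample minimizing the point-to-line distance
dominant = min(locus, key=lambda p: point_line_distance(k, p[1], p[2], xs, ys))[0]
```

In practice the search would run over a finely sampled locus; with these three samples the minimizing wavelength is 570 nm.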
the calculating the purity according to the chromaticity coordinates to be processed comprises the following steps:
calculating a first distance according to the spectrum locus chromaticity coordinate and the equal-energy white point chromaticity coordinate, wherein the calculation formula of the first distance is as follows:
SD = √[(x λ - x s )² + (y λ - y s )²]
wherein SD is the first distance, x λ is the abscissa of the spectrum locus chromaticity coordinate, y λ is the ordinate of the spectrum locus chromaticity coordinate, x s is the abscissa of the equal-energy white point chromaticity coordinate, and y s is the ordinate of the equal-energy white point chromaticity coordinate;
calculating a second distance according to the first chromaticity coordinate and the equal-energy white point chromaticity coordinate, wherein the calculation formula of the second distance is as follows:
SC = √[(x c - x s )² + (y c - y s )²]
wherein SC is the second distance, x c is the abscissa of the first chromaticity coordinate, y c is the ordinate of the first chromaticity coordinate, x s is the abscissa of the equal-energy white point chromaticity coordinate, and y s is the ordinate of the equal-energy white point chromaticity coordinate;
calculating the purity according to the first distance and the second distance, wherein the calculation formula of the purity is as follows:
P = SC/SD
wherein P is the purity, SC is the second distance, and SD is the first distance.
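The purity P = SC/SD is thus a ratio of two Euclidean distances in the chromaticity plane: how far the pixel sits from the white point relative to the fully saturated locus point at the same dominant wavelength. A small sketch, with the equal-energy white point at (1/3, 1/3) and illustrative coordinates for the pixel and the locus point:

```python
import math

def purity(xc, yc, x_lam, y_lam, xs, ys):
    # SC: distance from the white point (xs, ys) to the pixel chromaticity (xc, yc)
    sc = math.hypot(xc - xs, yc - ys)
    # SD: distance from the white point to the spectrum locus point (x_lam, y_lam)
    sd = math.hypot(x_lam - xs, y_lam - ys)
    return sc / sd

# a pixel lying between the white point and the locus is partially saturated,
# so its purity falls strictly between 0 and 1
p = purity(0.38, 0.48, 0.4441, 0.5547, 1 / 3, 1 / 3)
```

A purity near 1 indicates a highly saturated color (dense floating vegetation), while a purity near 0 indicates a near-white mixture.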
CN202311170191.1A 2023-09-12 2023-09-12 Multi-source satellite image floating plant identification method and system based on chromaticity index Active CN117152634B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311170191.1A CN117152634B (en) 2023-09-12 2023-09-12 Multi-source satellite image floating plant identification method and system based on chromaticity index

Publications (2)

Publication Number Publication Date
CN117152634A CN117152634A (en) 2023-12-01
CN117152634B true CN117152634B (en) 2024-06-04

Family

ID=88911848

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311170191.1A Active CN117152634B (en) 2023-09-12 2023-09-12 Multi-source satellite image floating plant identification method and system based on chromaticity index

Country Status (1)

Country Link
CN (1) CN117152634B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117409330B (en) * 2023-12-15 2024-03-29 中山大学 Aquatic vegetation identification method, aquatic vegetation identification device, computer equipment and storage medium

Citations (6)

Publication number Priority date Publication date Assignee Title
CN104361602A (en) * 2014-11-26 2015-02-18 中国科学院遥感与数字地球研究所 Water color detecting method and device based on MODIS image
CN104851113A (en) * 2015-04-17 2015-08-19 华中农业大学 Urban vegetation automatic extraction method of multiple-spatial resolution remote sensing image
CN112102312A (en) * 2020-09-29 2020-12-18 滁州学院 Moso bamboo forest remote sensing identification method based on satellite image and phenological difference containing red edge wave band
CN112396019A (en) * 2020-11-27 2021-02-23 佛山市墨纳森智能科技有限公司 Vegetation distribution identification method and system based on unmanned aerial vehicle and readable storage medium
CN114419436A (en) * 2022-01-12 2022-04-29 南方科技大学 Bloom area monitoring method, bloom area monitoring device, bloom area monitoring apparatus, storage medium, and program product
CN115908864A (en) * 2022-11-10 2023-04-04 国家海洋环境监测中心 MODIS image-based seawater color angle extraction method and water color condition remote sensing classification method

Family Cites Families (1)

Publication number Priority date Publication date Assignee Title
US20220156492A1 (en) * 2020-11-18 2022-05-19 Satsure Analytics India Private Limited System for producing satellite imagery with high-frequency revisits using deep learning to monitor vegetation

Non-Patent Citations (2)

Title
Juhua Luo et al. "A new technique for quantifying algal bloom, floating/emergent and submerged vegetation in eutrophic shallow lakes using Landsat imagery." Remote Sensing of Environment, 2023-02-01, pp. 1-19. *

Also Published As

Publication number Publication date
CN117152634A (en) 2023-12-01

Similar Documents

Publication Publication Date Title
Helmer et al. Cloud-free satellite image mosaics with regression trees and histogram matching
CN113033670B (en) Rice planting area extraction method based on Sentinel-2A/B data
CN117152634B (en) Multi-source satellite image floating plant identification method and system based on chromaticity index
EP1091188A1 (en) Method for correcting atmospheric influences in multispectral optical teledetection data
JP2012196167A (en) Plant species identification method
CN112183209A (en) Regional crop classification method and system based on multi-dimensional feature fusion
Szantoi et al. Fast and robust topographic correction method for medium resolution satellite imagery using a stratified approach
Kruse et al. Techniques developed for geologic analysis of hyperspectral data applied to near-shore hyperspectral ocean data
CN111553922A (en) Automatic cloud detection method for satellite remote sensing image
CN110703244A (en) Method and device for identifying urban water body based on remote sensing data
CN111274871B (en) Forest fire damage degree extraction method based on light and small unmanned aerial vehicle
CN113744249A (en) Marine ecological environment damage investigation method
Alwashe et al. Monitoring vegetation changes in Al Madinah, Saudi Arabia, using thematic mapper data
CN114419458A (en) Bare soil monitoring method, device and equipment based on high-resolution satellite remote sensing
CN111353402B (en) Remote sensing extraction method for oil palm forest
Belfiore et al. Orthorectification and pan-sharpening of worldview-2 satellite imagery to produce high resolution coloured ortho-photos
Raines Digital color analysis of color-ratio composite Landsat scenes
CN109461152B (en) Healthy vegetation detection method
Ceballos et al. Technical note The discrimination of scenes by principal components analysis of multi-spectral imagery
Imai et al. Shadow detection in hyperspectral images acquired by UAV
CN115019190A (en) Terrain broken region complex terrain information extraction method based on aerial remote sensing platform
CN113469104A (en) Radar remote sensing image surface water body change detection method and device based on deep learning
CN117152602A (en) Multisource satellite remote sensing image water body extraction method and system based on chromaticity index
Jaquet et al. Colour analysis of inland waters using Landsat TM data
Rabatel et al. A fully automatized processing chain for high-resolution multispectral image acquisition of crop parcels by UAV

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant