CN114663434A - Shadow discrimination method of side-scan sonar image - Google Patents


Info

Publication number
CN114663434A
CN114663434A (application CN202210571596.5A)
Authority
CN
China
Prior art keywords
value, gray, peak, effective, shadow
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202210571596.5A
Other languages
Chinese (zh)
Other versions
CN114663434B (en)
Inventor
刘大川
马龙
严晋
王文广
王雪
王晓丹
马治忠
杨德鹏
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shanghai Tara Marine Technology Co ltd
Beihai Ocean Technology Assurance Center Of State Oceanic Administration People's Republic Of China
Original Assignee
Shanghai Tara Marine Technology Co ltd
Beihai Ocean Technology Assurance Center Of State Oceanic Administration People's Republic Of China
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shanghai Tara Marine Technology Co ltd, Beihai Ocean Technology Assurance Center Of State Oceanic Administration People's Republic Of China filed Critical Shanghai Tara Marine Technology Co ltd
Priority to CN202210571596.5A priority Critical patent/CN114663434B/en
Publication of CN114663434A publication Critical patent/CN114663434A/en
Application granted granted Critical
Publication of CN114663434B publication Critical patent/CN114663434B/en
Expired - Fee Related legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00: Image analysis
    • G06T7/0002: Inspection of images, e.g. flaw detection
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00: Indexing scheme for image analysis or image enhancement
    • G06T2207/10: Image acquisition modality
    • G06T2207/10132: Ultrasound image
    • Y: GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02: TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02A: TECHNOLOGIES FOR ADAPTATION TO CLIMATE CHANGE
    • Y02A90/00: Technologies having an indirect contribution to adaptation to climate change
    • Y02A90/30: Assessment of water resources

Landscapes

  • Engineering & Computer Science (AREA)
  • Quality & Reliability (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Image Analysis (AREA)
  • Measurement Of Velocity Or Position Using Acoustic Or Ultrasonic Waves (AREA)

Abstract

The invention relates to the technical field of side-scan sonar, and in particular to a method for discriminating shadows in a side-scan sonar image. The method screens and interprets the image by combining curve analysis with a gray peak ratio and comprises the following steps: S1, reading the side-scan sonar image; S2, generating a statistical curve; S3, screening out gray peak points; S4, selecting a gray contrast value; S5, defining the shadow gray peak value; and S6, judging the peak ratio. The method can determine whether a side-scan sonar image contains a shadow, avoiding the influence of shadows on the subsequent image segmentation process.

Description

Shadow discrimination method of side-scan sonar image
Technical Field
The invention relates to the technical field of side-scan sonars, in particular to a method for judging the shadow of a side-scan sonar image.
Background
Side-scan sonar images are widely used in underwater target search, obstacle detection, and related fields. In particular, as the deployment density of torpedoes, mines, and similar devices in strategically important channels has increased in recent years, the demand for side-scan sonar image segmentation and target recognition has grown increasingly urgent.
The invention patent with application publication number CN110675410A discloses an unsupervised detection method for sunken-ship targets in side-scan sonar images based on a selective search algorithm, which comprises the following steps: preprocessing the side-scan sonar strip waterfall image; dividing the strip waterfall image into a water column area, a target area (shadow area), and a pure seabed background area based on prior knowledge of the basic characteristics of side-scan sonar, and segmenting the waterfall image into regions using a selective search strategy; defining a plurality of similarity measures, calculating the similarity measure of each region, and taking a weighted value as the final measure; and outputting the sunken-ship target detection result.
That technical scheme divides the image into a water column area, a target area (shadow area), and a pure seabed background area; however, when the side-scan sonar image contains a shadow, the segmentation result is seriously affected. It is therefore necessary to determine whether a side-scan sonar image contains a shadow before segmentation. At present, no algorithm has been specifically designed for discriminating shadows in side-scan sonar images, and to improve batch discrimination efficiency it is necessary to develop an automatic shadow discrimination algorithm.
Disclosure of Invention
The invention aims to remedy the defects of the prior art by providing a method for discriminating shadows in a side-scan sonar image.
In order to achieve the purpose, the invention adopts the following technical scheme:
A method for discriminating shadows in a side-scan sonar image performs screening and interpretation by combining curve analysis with a gray peak ratio, and comprises the following steps:
s1, reading the side-scan image: the side-scan sonar image is read and gray statistics are performed;
s2, generating a statistical curve: a gray statistical curve is generated from the gray statistics of S1, and the peak points on the curve are obtained;
s3, screening peak points: the peak points obtained in S2 are screened, interference peak points are eliminated, and effective peak points E are retained, each effective peak point E comprising an effective abscissa X and an effective ordinate Y, where X represents a gray value and Y represents the peak value of the gray statistics;
s4, selecting a gray contrast value: a gray contrast value M is selected from the gray value interval [35, 75];
s5, defining the shadow gray peak value: the gray contrast value M of S4 is compared with the effective abscissa X of every effective peak point E; there are the following cases:
if all effective abscissas X are greater than or equal to the gray contrast value M, the side-scan image is judged to be shadow-free;
if some effective abscissas X are greater than or equal to the gray contrast value M while the remaining effective abscissas X are smaller than M, the former are eliminated; among the latter, the maximum effective abscissa X is selected and its corresponding effective ordinate Y is defined as P1, while the effective ordinate Y of the maximum peak value in the statistical curve is defined as P2;
s6, peak ratio judgment: a proportion analysis is performed on the peak values P1 and P2 obtained in S5, based on the following formula:
R = P1 ÷ P2
where R is the shadow gray peak ratio, P1 is the peak value obtained in S5 whose gray value is smaller than the gray contrast value M, and P2 is the maximum peak value of the statistical curve obtained in S5;
when judging the value of R, there are the following cases:
if R > 0.1, the side-scan image is judged to contain a shadow;
if R ≤ 0.1, the side-scan image is judged to be shadow-free.
Preferably, in S2, a signal processing method is used to obtain the gray peak points.
Preferably, in S3, the peak points on the gray curve are screened using a significance analysis method; the significance analysis method reads the data information of each peak point, the data information comprising an abscissa x_i and an ordinate y_i, and comprises the following steps:
s3.1, determining the left crossing area and the left-area low point of each peak point;
s3.2, determining the right crossing area and the right-area low point of each peak point;
s3.3, calculating the significance value H of each peak point;
s3.4, selecting a significance contrast value C;
s3.5, flat peak elimination: the significance value H is compared with the significance contrast value C; there are the following cases:
if H ≥ C, the peak point corresponding to the significance value H is retained;
if H < C, the peak point corresponding to the significance value H is eliminated.
Preferably, the significance contrast value C is 40.
The invention has the beneficial effects that:
1. It solves the problem that shadows in side-scan images cannot be discriminated in the prior art: shadows in side-scan sonar images can be identified, which facilitates further segmentation and avoids segmentation failures caused by shadows.
2. Invalid peak points are eliminated through significance analysis during discrimination, avoiding interference from useless data and further improving fault tolerance.
3. Little computation is required during discrimination, so side-scan images can be discriminated quickly in batches.
Drawings
FIG. 1 is an overall step diagram of a method for determining the shadow of a side-scan sonar image according to the present invention;
fig. 2 is a diagram of the overall steps of the significance analysis in S3;
FIG. 3 is a graphical illustration of a significance analysis;
FIG. 4 is a logic diagram of S5 and S6;
FIG. 5 is a first side-scan image according to one embodiment;
FIG. 6 is a gray scale statistical curve of the first side-scan image;
FIG. 7 is a statistical curve of the gray scale statistical curve of the first side-scan image after significance analysis;
FIG. 8 is a second side-scan image according to the second embodiment;
FIG. 9 is a gray scale statistical curve of a second side-scan image;
FIG. 10 is a statistical curve of the gray scale statistical curve of the second side-scan image after significance analysis;
FIG. 11 is a third side-scan image according to the third embodiment;
FIG. 12 is a gray scale statistical curve of the third side-scan image;
fig. 13 is a statistical curve of the third side-scan image after the gray scale statistical curve is subjected to significance analysis.
Detailed Description
The technical solutions in the embodiments of the present invention will be described clearly and completely below with reference to the drawings in the embodiments. Obviously, the described embodiments are only some, not all, of the embodiments of the present invention.
Example 1
The invention discloses a method for discriminating shadows in a side-scan sonar image based on the combination of curve analysis and a gray peak ratio. A gray statistical curve is obtained by analyzing the side-scan image and is then analyzed comprehensively, so that whether a shadow exists in the image can be judged.
Referring to fig. 1, the overall steps of the method include the following steps:
s1, reading the side-scan image: the image generated by the side-scan sonar is read, and the subsequent gray statistics begin;
s2, generating a statistical curve: a gray statistical curve is generated from the gray statistics of S1, and the peak points on the curve are obtained. The peak points are obtained by a signal processing method, an effective prior-art technique that can quickly find all peak points and is therefore not described further. The X axis of the statistical curve is the gray value and the Y axis is the gray statistical peak value, i.e. the number of pixels at the given gray value;
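As an illustrative sketch (not part of the patent text), steps S1 and S2, gray statistics followed by peak detection on the statistical curve, can be reproduced roughly as follows; NumPy, SciPy, and the synthetic stand-in image are assumptions:

```python
# Rough sketch of S1-S2 (assumptions: NumPy/SciPy available; a synthetic
# image stands in for a real side-scan sonar file).
import numpy as np
from scipy.signal import find_peaks

def gray_statistic_curve(gray):
    """S1-S2: Y[k] = number of pixels whose gray value is k (k = 0..255)."""
    return np.bincount(gray.ravel(), minlength=256)

def peak_points(curve):
    """S2: all local maxima of the statistical curve as (X, Y) pairs."""
    idx, _ = find_peaks(curve)
    return list(zip(idx.tolist(), curve[idx].tolist()))

# Synthetic stand-in image: a dark mode near gray 40, a bright mode near 150.
rng = np.random.default_rng(0)
img = np.concatenate([rng.normal(40, 5, 5000), rng.normal(150, 10, 15000)])
img = np.clip(img, 0, 255).astype(np.uint8)

curve = gray_statistic_curve(img)
peaks = peak_points(img if False else curve)
```

The raw peak list produced here is intentionally noisy; the significance screening of S3 is what removes the spurious local maxima.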
s3, screening peak points: the gray peak points obtained in S2 are screened, interference peak points are eliminated, and effective peak points E are retained, each effective peak point E comprising an effective abscissa X and an effective ordinate Y;
referring to fig. 2, it should be noted that the screening of the peak points employs a saliency analysis, which is to check whether each peak point really represents a prominent peak, rather than a flat peak with small fluctuation. The significance analysis method specifically comprises the following steps:
and S3.1, determining a left crossing area and a left area low point of each peak point. The left crossing region is that the peak point extends to the left by a horizontal line until the intersection point with the statistical curve is generated or the statistical curve is exceeded. Referring to fig. 3, there are two peak points, i.e., m and n, respectively, where the m extends to the left of the horizontal line without any intersection, and the lower point of the left area of the m is the minimum point of the value on the left-side curve of the m, i.e., point a in the figure. The point n extends to the left to form a horizontal line, which forms an intersection point d with the curve, and the lower point of the left area of the point n is the minimum point of the values on the curve from the point n to the point d, i.e. the point b in the figure.
And S3.2, determining a right crossing area and a right region low point of each peak point. The definition of the right cross point and the right zone low point can be analogized to the left cross area and the left zone low point, and will not be described again. The lower point of the right area of the m point is a point b, and the lower point of the right area of the n point is a point c.
And S3.3, calculating the significance value H of each peak point. Firstly, comparing the sizes of the left area low point and the right area low point of each peak value point, and selecting a larger value as a reference point. Therefore, for the m point, the a point is a reference point; for point n, point c is the reference point. Then, the reference point extends out of the horizontal line, and the vertical distance between the peak point and the horizontal line is calculated to be the significant value H.
And S3.4, selecting a significant contrast value C. The significance comparison value C is used for judging the significance of the peak point, the value of the significance comparison value C can be automatically adjusted according to the precision requirement, and in this embodiment and subsequent embodiments, the value of the significance comparison value C is 40.
S3.5, flat peak elimination: the significance value H is compared with the significance contrast value C; there are the following cases:
if H ≥ C, the peak point corresponding to the significance value H is retained;
if H < C, the peak point corresponding to the significance value H is eliminated.
In fig. 3, the significance value H of point m is greater than the significance contrast value C, so point m is retained; the significance value H of point n is less than C, so point n is eliminated.
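The significance value H of steps S3.1 to S3.3 corresponds to what signal-processing libraries call peak prominence: horizontal lines are extended to the crossing points on either side, the low point of each side is found, and the peak is measured against the higher of the two lows. A minimal sketch assuming SciPy (the toy curve and threshold are invented for illustration):

```python
# Sketch of S3: keep only peaks whose significance value H (peak
# "prominence" in SciPy's terminology) reaches the contrast value C.
import numpy as np
from scipy.signal import find_peaks

def effective_peaks(curve, C=40):
    """Return (X, Y) pairs of peaks with prominence >= C (step S3.5)."""
    idx, _ = find_peaks(curve, prominence=C)
    return list(zip(idx.tolist(), curve[idx].tolist()))

# Toy curve: a prominent peak at x=3 (H=10) and a flat bump at x=7 (H=1).
toy = np.array([0, 1, 5, 10, 5, 2, 2, 3, 2, 0])
print(effective_peaks(toy, C=4))  # -> [(3, 10)]: the flat bump is culled
```

With a lower threshold, e.g. `effective_peaks(toy, C=1)`, both peaks survive, mirroring how the choice of C trades precision against tolerance.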
S4, selecting a gray contrast value: the gray contrast value M is selected from the gray value interval [35, 75]. In practice, the smaller the selected gray contrast value M, the stricter the shadow judgment condition; the larger the selected M, the looser the condition. M can be set according to the judgment standard; in this embodiment, M is 65;
s5, defining the shadow gray peak value: the gray contrast value M of S4 is compared with the abscissa X of every effective peak point E; there are the following cases:
Referring to fig. 4, if all effective abscissas X are greater than or equal to the gray contrast value M, the side-scan image is judged to be shadow-free. An effective abscissa X greater than or equal to M means that the corresponding pixels have high brightness and therefore do not belong to the shadow color; when no pixel of the side-scan image belongs to a shadow part, the image can be judged to have no shadow;
if some effective abscissas X are greater than or equal to the gray contrast value M and the others are smaller than M, the side-scan image contains both a brighter part and a darker part. The pixels whose effective abscissa X is greater than or equal to M belong to the brighter part and are therefore eliminated. The remaining effective abscissas X smaller than M belong to the darker part, which can be interpreted as shadow.
First, among the coordinate points whose effective abscissa X is smaller than the gray contrast value M, the effective abscissa X closest to M, i.e. the maximum of these abscissas, is selected, and its corresponding effective ordinate Y is defined as P1; the effective ordinate Y of the maximum peak value in the statistical curve is then defined as P2;
s6, peak ratio judgment: a proportion analysis is performed on the peak values P1 and P2 obtained in S5, based on the following formula:
R = P1 ÷ P2
where R is the shadow gray peak ratio, P1 is the peak value closest to the gray contrast value M obtained in S5, and P2 is the maximum peak value of the statistical curve. The purpose of calculating the shadow gray peak ratio is to obtain the proportion, within the image, between the number of pixels whose gray value is closest to M and the maximum peak value of the whole side-scan image.
Referring to FIG. 4, if R > 0.1, i.e. P1 : P2 > 10%, the proportion of the lower-brightness area is high, and the side-scan image is judged to contain a shadow; if R ≤ 0.1, i.e. P1 : P2 ≤ 10%, the proportion of the lower-brightness area is low, and the side-scan image is judged to be shadow-free.
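The decision logic of S4 to S6 can be condensed into a few lines; this is an illustrative sketch rather than the patent's reference implementation, and the function name `has_shadow` is our own:

```python
# Sketch of S4-S6. effective_peaks is a list of (X, Y) pairs that
# survived the significance screening of S3; M is the gray contrast value.
def has_shadow(effective_peaks, M=65):
    """Judge shadow presence from effective peaks and gray contrast value M."""
    dark = [(x, y) for x, y in effective_peaks if x < M]  # S5: peaks below M
    if not dark:
        return False            # all effective abscissas X >= M: no shadow
    P1 = max(dark)[1]           # ordinate of the dark peak closest to M
    P2 = max(y for _, y in effective_peaks)  # maximum peak of the curve
    return P1 / P2 > 0.1        # S6: shadow gray peak ratio R

# Effective peaks of embodiment 1 from the description:
print(has_shadow([(33, 1254), (117, 2317), (228, 173)], M=65))  # True
```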
The whole procedure is explained below with reference to examples:
referring to fig. 5, a first side-scan image is used for shadow determination.
Referring to fig. 6, a gray statistical curve is obtained after processing by the signal processing method, and includes 9 peak points.
With reference to fig. 7, significance analysis is performed on each peak point (the process has been described above and is not repeated); peak points with a significance below 40 are discarded, and 3 effective peak points E remain. Point 1 (33, 1254), point 2 (117, 2317), and point 3 (228, 173) have gray values of 33, 117, and 228, respectively. In this example, the gray contrast value M is 65; since 33 < 65 while 117 > 65 and 228 > 65, the effective ordinate Y of point 1 is P1, i.e. P1 = 1254. Since point 2 is the maximum peak of the gray statistics of the whole side-scan image, its effective ordinate Y is P2, i.e. P2 = 2317.
The calculated shadow gray peak ratio is R = 1254 ÷ 2317 ≈ 0.5412, i.e. 54.12%; since 54.12% is greater than 10%, the side-scan image is judged to contain a shadow.
Example 2
Referring to fig. 8, a second side-scan image is used for shadow determination.
Referring to fig. 9, a gray scale statistical curve is obtained after processing by the signal processing method, and includes 16 peak points.
With reference to fig. 10, significance analysis is performed on each peak point (the process has been described above and is not repeated); peak points with a significance below 40 are discarded, and 2 effective peak points E remain. Point 4 (148, 4728) and point 5 (156, 4542) have gray values of 148 and 156, respectively. In this example, the gray contrast value M is 75; since 148 > 75 and 156 > 75, there is no area with small gray values in the side-scan image, and it is directly judged that the side-scan image has no shadow.
Example 3
Referring to fig. 11, a third side-scan image is used for shadow determination.
Referring to fig. 12, a gray scale statistic curve is obtained after processing by the signal processing method, and includes 4 peak points.
Referring to fig. 13, significance analysis is performed on each peak point; peak points with a significance below 40 are discarded, and 3 effective peak points E remain. Point 6 (15, 669), point 7 (60, 183), and point 8 (135, 2297) have gray values of 15, 60, and 135, respectively. In this example, the gray contrast value M is 35; since 15 < 35 while 60 > 35 and 135 > 35, the effective ordinate Y of point 6 is P1, i.e. P1 = 669. Point 8 is the maximum peak of the gray statistics of the whole side-scan image, so its effective ordinate Y is P2, i.e. P2 = 2297.
The calculated shadow gray peak ratio is R = 669 ÷ 2297 ≈ 0.2912, i.e. 29.12%; since 29.12% is greater than 10%, the side-scan image is judged to contain a shadow.
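As a quick numeric cross-check (a Python sketch, not part of the patent), the three worked examples above can be replayed from their effective peak coordinates:

```python
# Replaying embodiments 1-3 with the peak coordinates and M values
# given in the description (pure arithmetic, no external data).
checks = [
    ([(33, 1254), (117, 2317), (228, 173)], 65),   # embodiment 1
    ([(148, 4728), (156, 4542)], 75),              # embodiment 2
    ([(15, 669), (60, 183), (135, 2297)], 35),     # embodiment 3
]
results = []
for peaks, M in checks:
    dark = [(x, y) for x, y in peaks if x < M]
    # R = P1 / P2, where P1 is the dark peak closest to M and P2 the
    # maximum peak of the whole curve; no dark peak means no shadow.
    shadow = bool(dark) and max(dark)[1] / max(y for _, y in peaks) > 0.1
    results.append(shadow)
print(results)  # [True, False, True]
```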
The above description covers only preferred embodiments of the invention, but the scope of the invention is not limited thereto. Any equivalent substitution or modification made by a person skilled in the art according to the technical solutions and inventive concept of the present invention shall fall within the scope of the invention.

Claims (4)

1. A method for discriminating shadows in a side-scan sonar image, characterized in that screening and interpretation are performed by combining curve analysis with a gray peak ratio, the method comprising the following steps:
s1, reading the side-scan image: the side-scan sonar image is read and gray statistics are performed;
s2, generating a statistical curve: a gray statistical curve is generated from the gray statistics of S1, and the peak points on the curve are obtained;
s3, screening peak points: the peak points obtained in S2 are screened, interference peak points are eliminated, and effective peak points E are retained, each effective peak point E comprising an effective abscissa X and an effective ordinate Y, where X represents a gray value and Y represents the peak value of the gray statistics;
s4, selecting a gray contrast value: a gray contrast value M is selected from the gray value interval [35, 75];
s5, defining the shadow gray peak value: the gray contrast value M of S4 is compared with the effective abscissa X of every effective peak point E; there are the following cases:
if all effective abscissas X are greater than or equal to the gray contrast value M, the side-scan image is judged to be shadow-free;
if some effective abscissas X are greater than or equal to the gray contrast value M while the remaining effective abscissas X are smaller than M, the former are eliminated; among the latter, the maximum effective abscissa X is selected and its corresponding effective ordinate Y is defined as P1, while the effective ordinate Y of the maximum peak value in the statistical curve is defined as P2;
s6, peak ratio judgment: a proportion analysis is performed on the peak values P1 and P2 obtained in S5, based on the following formula:
R = P1 ÷ P2
where R is the shadow gray peak ratio, P1 is the peak value obtained in S5 whose gray value is smaller than the gray contrast value M, and P2 is the maximum peak value of the statistical curve obtained in S5;
when judging the value of R, there are the following cases:
if R > 0.1, the side-scan image is judged to contain a shadow;
if R ≤ 0.1, the side-scan image is judged to be shadow-free.
2. The method for discriminating shadows in a side-scan sonar image according to claim 1, characterized in that: in S2, a signal processing method is used to obtain each peak point on the gray curve.
3. The method for discriminating shadows in a side-scan sonar image according to claim 1, characterized in that: in S3, a significance analysis method is used to screen the peak points on the gray curve; the significance analysis method reads the data information of each peak point, the data information comprising an abscissa x_i and an ordinate y_i, and comprises the following steps:
s3.1, determining the left crossing area and the left-area low point of each peak point;
s3.2, determining the right crossing area and the right-area low point of each peak point;
s3.3, calculating the significance value H of each peak point;
s3.4, selecting a significance contrast value C;
s3.5, flat peak elimination: the significance value H is compared with the significance contrast value C; there are the following cases:
if H ≥ C, the peak point corresponding to the significance value H is retained;
if H < C, the peak point corresponding to the significance value H is eliminated.
4. The method for discriminating shadows in a side-scan sonar image according to claim 3, characterized in that: the significance contrast value C is 40.
CN202210571596.5A 2022-05-25 2022-05-25 Shadow discrimination method of side-scan sonar image Expired - Fee Related CN114663434B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210571596.5A CN114663434B (en) 2022-05-25 2022-05-25 Shadow discrimination method of side-scan sonar image

Publications (2)

Publication Number Publication Date
CN114663434A (en) 2022-06-24
CN114663434B CN114663434B (en) 2022-08-23

Family

ID: 82037580

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210571596.5A Expired - Fee Related CN114663434B (en) 2022-05-25 2022-05-25 Shadow discrimination method of side-scan sonar image

Country Status (1)

Country Link
CN (1) CN114663434B (en)

Citations (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101621607A (en) * 2009-07-24 2010-01-06 天津三星光电子有限公司 Method for eliminating image shades of digital camera
CN102855627A (en) * 2012-08-09 2013-01-02 武汉大学 City remote sensing image shadow detection method based on spectral characteristic and topological relation
CN103324936A (en) * 2013-05-24 2013-09-25 北京理工大学 Vehicle lower boundary detection method based on multi-sensor fusion
CN105574529A (en) * 2016-01-28 2016-05-11 中国船舶重工集团公司第七一〇研究所 Target detection method of side scan sonar
CN105574888A (en) * 2016-03-01 2016-05-11 浙江工业大学 Crack position searching method based on gray peak value
CN105989611A (en) * 2015-02-05 2016-10-05 南京理工大学 Blocking perception Hash tracking method with shadow removing
CN106529592A (en) * 2016-11-07 2017-03-22 湖南源信光电科技有限公司 License plate recognition method based on mixed feature and gray projection
CN107341474A (en) * 2017-07-06 2017-11-10 淮海工学院 A kind of non-supervisory detection method of sidescan-sonar image target based on diffusion mapping
CN109035273A (en) * 2018-08-08 2018-12-18 华中科技大学 A kind of picture signal fast partition method of immune chromatography test card
CN109146806A (en) * 2018-07-29 2019-01-04 国网上海市电力公司 Gauge pointer position detection recognition methods based on shadow removing optimization in remote monitoriong of electric power
CN109559321A (en) * 2018-11-28 2019-04-02 清华大学 A kind of sonar image dividing method and equipment
CN109712212A (en) * 2018-12-20 2019-05-03 中国兵器科学研究院宁波分院 A kind of industry CT artifact correction method
CN110675410A (en) * 2019-09-25 2020-01-10 江苏海洋大学 Side-scan sonar sunken ship target unsupervised detection method based on selective search algorithm
CN110706177A (en) * 2019-09-30 2020-01-17 北京大学 Method and system for equalizing gray level of side-scan sonar image
CN111738278A (en) * 2020-06-22 2020-10-02 黄河勘测规划设计研究院有限公司 Underwater multi-source acoustic image feature extraction method and system
CN111915625A (en) * 2020-08-13 2020-11-10 湖南省有色地质勘查研究院 Energy integral remote sensing image terrain shadow automatic detection method and system
CN113052872A (en) * 2021-03-12 2021-06-29 浙江大学 Underwater moving object tracking method based on sonar image

Non-Patent Citations (5)

* Cited by examiner, † Cited by third party
Title
JASON RHINELANDER: "Feature extraction and target classification of side-scan sonar images", 2016 IEEE Symposium Series on Computational Intelligence (SSCI) *
JUNWEI LI et al.: "A Local Region-Based Level Set Method With Markov Random Field for Side-Scan Sonar Image Multi-Level Segmentation", IEEE Sensors Journal *
YE Xiufen et al.: "Unsupervised sonar image segmentation method based on Markov random fields", Journal of Harbin Engineering University *
LI Peng: "Research on feature matching methods for side-scan sonar images", China Doctoral Dissertations Full-text Database, Information Science and Technology *
XIONG Pingbo: "Research on pseudo-color image processing in side-scan sonar", China Master's Theses Full-text Database, Engineering Science and Technology II *

Also Published As

Publication number Publication date
CN114663434B (en) 2022-08-23

Similar Documents

Publication Publication Date Title
CN114937055B (en) Image self-adaptive segmentation method and system based on artificial intelligence
CN113689428B (en) Mechanical part stress corrosion detection method and system based on image processing
CN115439481B (en) Deaerator welding quality detection method based on image processing
CN108961235B (en) Defective insulator identification method based on YOLOv3 network and particle filter algorithm
CN115511889B (en) Method for detecting welding defects on surface of solar cell panel bracket
CN111626190A (en) Water level monitoring method for scale recognition based on clustering partitions
CN114842027A (en) Fabric defect segmentation method and system based on gray level co-occurrence matrix
CN115294140B (en) Hardware part defect detection method and system
CN111242026B (en) Remote sensing image target detection method based on spatial hierarchy perception module and metric learning
US6993187B2 (en) Method and system for object recognition using fractal maps
CN114820625B (en) Automobile top block defect detection method
CN112598054B (en) Power transmission and transformation project quality common disease prevention and detection method based on deep learning
CN113256624A (en) Continuous casting round billet defect detection method and device, electronic equipment and readable storage medium
CN115294159B (en) Method for dividing corroded area of metal fastener
CN116309599B (en) Water quality visual monitoring method based on sewage pretreatment
CN115100174B (en) Ship sheet metal part paint surface defect detection method
CN114782329A (en) Bearing defect damage degree evaluation method and system based on image processing
CN115170567B (en) Method for detecting defects of waterproof steel plate for ship
CN116883408B (en) Integrating instrument shell defect detection method based on artificial intelligence
CN110210316A (en) Traffic lights digit recognition method based on gray level image
CN114881965A (en) Wood board joint detection method based on artificial intelligence and image processing
CN114663434B (en) Shadow discrimination method of side-scan sonar image
CN116563786A (en) TEDS jumper fault identification detection method, storage medium and equipment
CN114627059B (en) Data processing-based stockbridge damper bolt detection method
CN115082444A (en) Copper pipe weld defect detection method and system based on image processing

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20220823