CN113537174A - Coral reef habitat survey video analysis method - Google Patents


Info

Publication number
CN113537174A
Authority
CN
China
Prior art keywords
video
coral
frame
coral reef
coverage
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202111084090.3A
Other languages
Chinese (zh)
Other versions
CN113537174B (en
Inventor
刘辉
赵建民
逯文强
董志军
王清
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Yantai Institute of Coastal Zone Research of CAS
Original Assignee
Yantai Institute of Coastal Zone Research of CAS
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Yantai Institute of Coastal Zone Research of CAS filed Critical Yantai Institute of Coastal Zone Research of CAS
Priority to CN202111084090.3A priority Critical patent/CN113537174B/en
Publication of CN113537174A publication Critical patent/CN113537174A/en
Application granted granted Critical
Publication of CN113537174B publication Critical patent/CN113537174B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06F — ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 — Pattern recognition
    • G06F18/20 — Analysing
    • G06F18/24 — Classification techniques
    • G06T — IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 — Image enhancement or restoration
    • G06T5/20 — Image enhancement or restoration using local operators
    • G06T5/70 — Denoising; Smoothing
    • G06T7/00 — Image analysis
    • G06T7/60 — Analysis of geometric attributes
    • G06T2207/00 — Indexing scheme for image analysis or image enhancement
    • G06T2207/20 — Special algorithmic details
    • G06T2207/20024 — Filtering details

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Artificial Intelligence (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Geometry (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • General Engineering & Computer Science (AREA)
  • Farming Of Fish And Shellfish (AREA)
  • Image Analysis (AREA)

Abstract

The invention discloses a coral reef habitat survey video analysis method, a storage medium and a terminal, belonging to the field of coral reef habitat survey. The method comprises the following steps. Step 1: acquire video of the coral reef habitat. Step 2: semantically segment the collected coral reef habitat video frame by frame, and calculate the coverage of each coral type frame by frame. Step 3: obtain a weighting coefficient for each video frame. Step 4: estimate the total coverage of the coral reef habitat and the coverage of each coral type from the per-frame coverages and the corresponding frame weighting coefficients. By recognizing corals with image processing techniques, the method eliminates a large amount of underwater manual identification and recording work, greatly improves the degree of automation, and yields accurate and reliable coral coverage estimates.

Description

Coral reef habitat survey video analysis method
Technical Field
The invention relates to the field of coral reef habitat survey, and in particular to a coral reef habitat survey video analysis method, a storage medium and a terminal.
Background
Coral reefs are biogenic reefs, and the coral reef ecosystem is regarded as the soul of island-reef protection. Coral reef ecosystems are rich in species and highly productive, and are known as the tropical rain forests of the ocean. Under the combined pressures of global climate change and human activities, coral reef ecosystems have degraded severely, making their monitoring and evaluation especially important. The habitat characteristics of coral reefs are a key factor determining the structure and distribution of biological communities such as fishes, and habitat evaluation is a basic component of coral reef ecosystem research.
Underwater video is an important means of acquiring information on the habitat and its resident communities; it includes divers carrying cameras for sample-point and sample-belt surveys, as well as video surveys by underwater robots and other tethered equipment. Chinese invention patent application No. 202110017362.1, entitled "Cold water coral distribution prediction method and system based on sample selection expansion", discloses a method that can effectively improve the accuracy of prediction results.
In daily practice, the inventors found the following problem with the prior art: because divers or underwater robots swim at uneven speeds, the collected video often contains uneven numbers of frames over different sample-belt intervals. This uneven swimming speed biases the coral coverage estimate, so calculating coral reef coverage from such video carries a large error.
In view of the above, it is necessary to provide a new technical solution to solve the above problems.
Disclosure of Invention
To solve the above technical problem, the present application provides a coral reef habitat survey video analysis method that improves the accuracy and reliability of video-based coral reef coverage calculation, comprising the following steps:
step 1: video acquisition is carried out on the coral reef habitat;
step 2: training a coral habitat image semantic segmentation model, carrying out frame-by-frame image semantic segmentation on the collected coral reef habitat video, and calculating the coverage of different kinds of coral reefs frame by frame; the semantic segmentation is to classify seawater, stones and corals of different types appearing in the coral habitat image;
and step 3: obtaining the weighting coefficient of the corresponding frame of the video;
and 4, step 4: and estimating the total coverage of the coral reef habitat and the coverage of each coral reef by using the coverage of different kinds of coral reefs and the weighting coefficient of the corresponding video frame.
Preferably, step 1 comprises:
(11) laying a sample belt in the coral reef area;
(12) a diver filming the coral reef habitat video along the sample belt.
Preferably, the sample belt is 50 to 100 meters long.
Preferably, step 2 comprises:
(21) selecting no fewer than 50 video screenshots that together contain all coral types appearing in the video; annotating the different coral types and the non-coral seabed with the Labelme annotation software to establish a semantic segmentation data set; and training and validating a DeepLabV3+ model on this data set to generate the coral habitat image semantic segmentation model;
(22) performing frame-by-frame semantic segmentation of the coral reef habitat video with the trained DeepLabV3+ model to generate a semantic map of the coral reef distribution in each frame;
(23) calculating the coverage of each coral type frame by frame from the semantic maps.
Preferably, the coverage of a coral type is the ratio of its pixel count to the total pixel count of the picture.
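As a minimal illustration of how a per-frame semantic map yields coverage values (the class IDs and the tiny label map below are hypothetical stand-ins for one frame of segmentation output, not values from the patent):

```python
import numpy as np

def coverage_per_class(label_map, class_ids):
    """Coverage of each class = its pixel count / total pixels in the frame."""
    total = label_map.size
    return {c: float(np.count_nonzero(label_map == c)) / total for c in class_ids}

# Hypothetical 4x4 label map: 0 = seawater, 1 = stone, 2 and 3 = two coral types
label_map = np.array([[0, 0, 2, 2],
                      [0, 1, 2, 3],
                      [1, 1, 3, 3],
                      [0, 0, 0, 3]])
cov = coverage_per_class(label_map, class_ids=[2, 3])
# coral type 2 occupies 3 of 16 pixels, coral type 3 occupies 4 of 16
```

In a real pipeline the same counting would run over each frame's full-resolution semantic map.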
Preferably, step 3 comprises:
(31) extracting the directions and magnitudes of the feature-point optical flow vectors between every two consecutive frames of the video using a sparse optical flow method;
(32) processing the direction of each feature-point optical flow vector between two frames as follows:
the angle between the optical flow vector and the horizontal rightward unit vector (1, 0) is divided into 72 intervals of 5 degrees each, and each optical flow vector is assigned to an interval according to its direction;
(33) accumulating the optical flow vector magnitudes within each interval;
(34) smoothing the resulting one-dimensional vector of per-interval accumulated magnitudes with Gaussian filtering;
(35) taking the maximum of the accumulated magnitudes within the 50-130 degree range as the inter-frame video motion speed, denoted V, and the angular midpoint of the interval containing that maximum as the inter-frame video motion angle, denoted θ;
(36) calculating the weighting coefficient of the corresponding video frame as:
V·sinθ
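The binning, smoothing and V·sinθ computation of steps (32)-(36) can be sketched in NumPy as follows. The feature-point flow vectors are assumed to have been extracted beforehand (in practice with a sparse optical flow routine such as OpenCV's calcOpticalFlowPyrLK), and the Gaussian kernel width is an assumption, since the patent does not specify it:

```python
import numpy as np

def frame_weight(flow_vectors, bin_width_deg=5, sector=(50, 130), sigma=1.0):
    """Steps (32)-(36): bin flow vectors by direction into 5-degree intervals,
    accumulate magnitudes, Gaussian-smooth the 72-bin histogram, and return
    V*sin(theta) from the dominant bin inside the 50-130 degree sector."""
    v = np.asarray(flow_vectors, dtype=float)              # shape (N, 2): (dx, dy)
    ang = np.degrees(np.arctan2(v[:, 1], v[:, 0])) % 360.0  # 0-360 deg, CCW
    mag = np.hypot(v[:, 0], v[:, 1])
    nbins = 360 // bin_width_deg                            # 72 intervals
    hist = np.zeros(nbins)
    idx = (ang // bin_width_deg).astype(int) % nbins
    np.add.at(hist, idx, mag)                               # step (33): accumulate
    # Step (34): smooth with a small Gaussian kernel, wrapping around 0/360 deg
    k = np.exp(-0.5 * (np.arange(-3, 4) / sigma) ** 2)
    k /= k.sum()
    hist = np.convolve(np.r_[hist[-3:], hist, hist[:3]], k, mode="same")[3:-3]
    # Step (35): dominant bin restricted to the 50-130 degree sector
    lo, hi = sector[0] // bin_width_deg, sector[1] // bin_width_deg
    b = lo + int(np.argmax(hist[lo:hi]))
    V = hist[b]                                             # motion speed
    theta = np.radians((b + 0.5) * bin_width_deg)           # interval midpoint
    return V * np.sin(theta)                                # step (36)
```

For vectors pointing straight "down" the sample belt (direction near 90 degrees), sinθ is close to 1 and the weight tracks the motion speed, which is what lets the weight compensate for uneven swimming speed.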
Preferably, step 4 comprises:
(41) multiplying the per-frame coverage of each coral type from step 2 by the weighting coefficient of the corresponding video frame from step 3 to obtain the corrected per-frame coverage; the original coverage data of the first frame is left unprocessed, and processing starts from the second frame;
(42) averaging the corrected per-frame coverages to obtain the total coral reef coverage over the whole sample belt and the coverage of each coral type.
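Step 4 can be sketched as below. The patent applies the weights as-is, without stating any normalization, so the sketch does the same; the dictionary layout of the per-frame coverages is an assumed representation:

```python
def corrected_total_coverage(frame_coverages, frame_weights):
    """Step 4: multiply each frame's coverage by its weighting coefficient
    (the first frame keeps its original coverage), then average over frames.

    frame_coverages: list of dicts {coral_type: coverage}, one per frame
    frame_weights:   weighting coefficients for frames 2..N (V*sin(theta))
    """
    corrected = [frame_coverages[0]]                 # first frame unprocessed
    for cov, w in zip(frame_coverages[1:], frame_weights):
        corrected.append({t: c * w for t, c in cov.items()})
    types = frame_coverages[0].keys()
    per_type = {t: sum(f[t] for f in corrected) / len(corrected) for t in types}
    total = sum(per_type.values())
    return total, per_type
```

The per-type averages give the coverage of each coral type over the whole sample belt, and their sum gives the total coral reef coverage.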
Compared with the prior art, the present application has at least the following beneficial effects:
(1) corals are identified with image processing techniques, saving a large amount of underwater manual identification and recording work and greatly improving the degree of automation;
(2) compared with sample-point methods of coral reef coverage calculation, the sample-belt method makes full use of all the data collected along the sample belt, giving more reliable results;
(3) the coral reef habitat video collected along the sample belt is corrected with the weighting coefficients, improving the accuracy and reliability of the coral coverage calculation.
Drawings
Some specific embodiments of the invention will be described in detail hereinafter, by way of illustration and not limitation, with reference to the accompanying drawings. The same reference numbers in the drawings identify the same or similar elements or components. Those skilled in the art will appreciate that the drawings are not necessarily drawn to scale. In the drawings:
FIG. 1 shows pattern A printed on the sheets used in the indoor test of the present invention;
FIG. 2 shows pattern B printed on the sheets used in the indoor test of the present invention;
FIG. 3 shows the validation-set accuracy at different epochs during DeepLabV3+ model training;
FIG. 4 is a real underwater image of a coral reef habitat;
FIG. 5 is the test-set annotation of a real coral reef habitat image for the DeepLabV3+ model;
FIG. 6 shows the DeepLabV3+ model's prediction for the coral reef habitat in the test set;
FIG. 7 is the legend identifying the different coral types in FIGS. 5 and 6.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the technical solutions of the present application will be described in detail and completely with reference to the following specific embodiments of the present application and the accompanying drawings. It should be apparent that the described embodiments are only some of the embodiments of the present application, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
As shown in FIGS. 1 to 7, the video analysis tests for the coral reef habitat survey were conducted as follows:
Step one: indoor test of the influence of video motion speed on coverage estimation, comprising the following steps:
(11) Sheets of A4 paper printed with black pattern A and black pattern B are laid out in sequence on the indoor floor to simulate coral reefs distributed on the seabed; the patterns on the first 6 sheets are smaller than those on the last 6 sheets. A tester holds the camera perpendicular to the ground and films the A4 paper from above at as constant a speed as possible.
(12) The video is read frame by frame, each frame is converted to grayscale and binarized, and the influence of the dark texture on the paper surface is removed by deleting connected components with small areas. The area proportion of the black pattern in each frame is then extracted and recorded as the coverage of that frame, and the mean coverage over all frames is taken as the total coverage of the filmed area.
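Step (12) can be sketched as follows; the threshold and minimum component area are assumed values, and the connected-component deletion is written in plain Python/NumPy rather than an image-processing library:

```python
import numpy as np
from collections import deque

def clean_binary(gray, thresh=128, min_area=20):
    """Binarize a grayscale frame (dark pattern -> 1), then delete connected
    components smaller than min_area to remove dark paper texture."""
    binary = (np.asarray(gray) < thresh).astype(np.uint8)
    h, w = binary.shape
    seen = np.zeros((h, w), dtype=bool)
    out = binary.copy()
    for sy in range(h):
        for sx in range(w):
            if binary[sy, sx] and not seen[sy, sx]:
                # BFS over the 4-connected component starting at (sy, sx)
                comp, q = [], deque([(sy, sx)])
                seen[sy, sx] = True
                while q:
                    y, x = q.popleft()
                    comp.append((y, x))
                    for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
                        if 0 <= ny < h and 0 <= nx < w and binary[ny, nx] and not seen[ny, nx]:
                            seen[ny, nx] = True
                            q.append((ny, nx))
                if len(comp) < min_area:        # small component: texture noise
                    for y, x in comp:
                        out[y, x] = 0
    return out

def frame_coverage(binary):
    """Area proportion of the (cleaned) black pattern in the frame."""
    return float(binary.sum()) / binary.size
```

Averaging frame_coverage over all frames then gives the total coverage of the filmed area, as described above.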
(13) The results give a total video coverage of 0.2588, with a black-pattern coverage of 0.1010 in the first half of the video and 0.4080 in the second half. The total coverage is close to the average of the first-half and second-half coverages, in line with theory.
(14) The first half of the video is then shortened to 1/2 of its original duration and frame count and spliced with the second half to obtain a new test video. Calculating the black-pattern coverage of the new test video with the method of step (12) gives 0.3101, which deviates considerably from the coverage of the original video.
This test shows that video captured at varying camera motion speeds biases the coverage estimate.
Step two: indoor test of the reliability of optical-flow-based correction, comprising the following steps:
(21) Using the new test video obtained in step (14), the directions and magnitudes of the feature-point optical flow vectors between every two consecutive frames are extracted with a sparse optical flow method.
(22) Optical flow directions are expressed counterclockwise from 0 to 360 degrees, so the theoretical direction of the video motion is 90 degrees. The angle between each inter-frame optical flow vector and the horizontal rightward unit vector (1, 0) is divided into 72 intervals of 5 degrees each;
(23) each optical flow vector is assigned to an interval according to its direction, and the magnitudes within each interval are accumulated.
(24) The resulting one-dimensional vector of per-interval accumulated magnitudes is smoothed with Gaussian filtering.
(25) The maximum of the accumulated magnitudes within the 50-130 degree range is taken as the inter-frame video motion speed, denoted V; the angular midpoint of the interval containing the maximum is taken as the inter-frame video motion angle, denoted θ. The product V·sinθ serves as the weighting coefficient of the corresponding video frame.
(26) The frame-by-frame coverages are corrected with these weights, giving a black-pattern coverage of 0.2792 for the test video. Computed as (0.3101 − 0.2792)/(0.3101 − 0.2628), the coverage error is reduced by 65.3%.
This test demonstrates the effectiveness of the optical flow method in correcting the coverage error caused by variations in video motion speed. Table 1 compares the original coverages from step one with the corrected coverages from step two.
TABLE 1 Comparison of original and corrected coverage values
Video                                    Original coverage   Corrected coverage
Original video                           0.2588              0.2628
First-half video                         0.1010              —
Second-half video                        0.4080              —
First half frames halved + second half   0.3101              0.2792
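The bias and the reported error reduction can be checked arithmetically (a sanity check on the table's numbers, not part of the patented method):

```python
# Halving the first half's frame count shifts the frame-average coverage
# toward the second half: with N frames per half, the spliced video averages
# (N/2 * 0.1010 + N * 0.4080) / (1.5 * N), independent of N.
biased = (0.5 * 0.1010 + 1.0 * 0.4080) / 1.5
# ~0.3057, close to the 0.3101 measured for the spliced test video

# Error reduction reported in step (26):
reduction = (0.3101 - 0.2792) / (0.3101 - 0.2628)
# ~0.653, i.e. the coverage error shrinks by about 65.3%
```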
Step three: establishing and testing the coral reef habitat image semantic segmentation model, comprising the following steps:
(31) Screenshots are taken from the underwater sample-belt video of the coral reef area; more than 50 are taken (125 in this test), covering all coral types appearing in the video. The pictures are then annotated with the labelme software, with the specific coral types identified by specialists.
(32) The data set is divided into a training set, a validation set and a test set in a 7:2:1 ratio. The training and validation data are used to train and validate a DeepLabV3+ model on the PaddlePaddle PaddleX deep learning platform; the model uses a ResNet50 backbone, fine-tuned from parameters pre-trained on the Pascal VOC data set.
(33) Validation gives an average validation-set accuracy of 0.7980. Manual inspection of the trained model's predictions and segmentations on the test set shows that the method meets the requirements of overall coral reef survey and evaluation.
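The 7:2:1 split of step (32) can be sketched generically as follows; PaddleX ships its own dataset-splitting tooling, so this is an illustrative stand-in rather than the PaddleX API:

```python
import random

def split_dataset(items, ratios=(7, 2, 1), seed=0):
    """Shuffle the labeled screenshots and split them train/val/test at 7:2:1."""
    items = list(items)
    random.Random(seed).shuffle(items)       # deterministic shuffle for this sketch
    total = sum(ratios)
    n_train = len(items) * ratios[0] // total
    n_val = len(items) * ratios[1] // total
    return (items[:n_train],
            items[n_train:n_train + n_val],
            items[n_train + n_val:])

train, val, test = split_dataset(range(125))
# 125 screenshots -> 87 train, 25 validation, 13 test
```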
The above description is only a preferred embodiment of the present invention and is not intended to limit the present invention, and various modifications and changes may be made by those skilled in the art. Any modification, equivalent replacement, or improvement made within the spirit and principle of the present invention should be included in the protection scope of the present invention.

Claims (7)

1. A coral reef habitat survey video analysis method, characterized by comprising the following steps:
step 1: acquiring video of the coral reef habitat;
step 2: training a coral habitat image semantic segmentation model, performing frame-by-frame semantic segmentation of the collected coral reef habitat video, and calculating the coverage of each coral type frame by frame;
step 3: obtaining a weighting coefficient for each video frame;
step 4: estimating the total coverage of the coral reef habitat and the coverage of each coral type from the per-frame coverages and the corresponding frame weighting coefficients.
2. The coral reef habitat survey video analysis method as claimed in claim 1, wherein said step 1 comprises:
(11) laying a sample belt in the coral reef area;
(12) a diver filming the coral reef habitat video along the sample belt.
3. The coral reef habitat survey video analysis method as claimed in claim 2, wherein the sample belt is 50 to 100 meters long.
4. The coral reef habitat survey video analysis method as claimed in claim 1, wherein said step 2 comprises:
(21) establishing a semantic segmentation data set, and training and validating a DeepLabV3+ model on the data set to generate the coral habitat image semantic segmentation model;
(22) performing frame-by-frame semantic segmentation of the coral reef habitat video with the trained DeepLabV3+ model to generate a semantic map of the coral reef distribution in each frame;
(23) calculating the coverage of each coral type frame by frame from the semantic maps.
5. The coral reef habitat survey video analysis method as claimed in claim 4, wherein the coverage of a coral type is the ratio of its pixel count to the total pixel count of the picture.
6. The coral reef habitat survey video analysis method as claimed in claim 1, wherein said step 3 comprises:
(31) extracting the directions and magnitudes of the feature-point optical flow vectors between every two consecutive frames of the video using a sparse optical flow method;
(32) processing the direction of each feature-point optical flow vector between two frames as follows:
the angle between the optical flow vector and the horizontal rightward unit vector (1, 0) is divided into 72 intervals of 5 degrees each, and each optical flow vector is assigned to an interval according to its direction;
(33) accumulating the optical flow vector magnitudes within each interval;
(34) smoothing the resulting one-dimensional vector of per-interval accumulated magnitudes with Gaussian filtering;
(35) taking the maximum of the accumulated magnitudes within the 50-130 degree range as the inter-frame video motion speed, denoted V, and the angular midpoint of the interval containing that maximum as the inter-frame video motion angle, denoted θ;
(36) calculating the weighting coefficient of the corresponding video frame as:
V·sinθ.
7. The coral reef habitat survey video analysis method as claimed in claim 1, wherein said step 4 comprises:
(41) multiplying the per-frame coverage of each coral type from step 2 by the weighting coefficient of the corresponding video frame from step 3 to obtain the corrected per-frame coverage; the original coverage data of the first frame is left unprocessed, and processing starts from the second frame;
(42) averaging the corrected per-frame coverages to obtain the total coral reef coverage over the whole sample belt and the coverage of each coral type.
CN202111084090.3A 2021-09-16 2021-09-16 Coral reef habitat survey video analysis method Active CN113537174B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111084090.3A CN113537174B (en) 2021-09-16 2021-09-16 Coral reef habitat survey video analysis method


Publications (2)

Publication Number Publication Date
CN113537174A 2021-10-22
CN113537174B 2021-12-28

Family

ID=78092697

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111084090.3A Active CN113537174B (en) 2021-09-16 2021-09-16 Coral reef habitat survey video analysis method

Country Status (1)

Country Link
CN (1) CN113537174B (en)

Citations (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102608269A (en) * 2012-03-13 2012-07-25 广西红树林研究中心 Coral reef belt transect investigation method
CN104217103A (en) * 2014-08-13 2014-12-17 中国农业科学院植物保护研究所 Method for building and digitally expressing grassland vegetation subtypes
CN106416652A (en) * 2016-07-12 2017-02-22 交通运输部科学研究院 Turf protection utilization method
CN109325431A (en) * 2018-09-12 2019-02-12 内蒙古大学 The detection method and its device of vegetation coverage in Crazing in grassland sheep feeding path
CN109765932A (en) * 2019-01-31 2019-05-17 交通运输部天津水运工程科学研究所 A kind of desert shrubbery cover degree unmanned plane investigation method
CN110059553A (en) * 2019-03-13 2019-07-26 中国科学院遥感与数字地球研究所 The method for knowing potential landslide stage vegetation anomalies feature is sentenced using optical remote sensing image
CN110163924A (en) * 2019-05-09 2019-08-23 海南省海洋与渔业科学院 Coral cover calculation method based on color
CN110220845A (en) * 2019-04-26 2019-09-10 长江流域水环境监测中心 The in-situ monitoring method and device of a kind of raw algae growth band
US20200053988A1 (en) * 2018-08-17 2020-02-20 Andrew MacKay Ross Coral nursery and planting system based on a ring or washer mount
CN110929706A (en) * 2020-02-19 2020-03-27 北京海天瑞声科技股份有限公司 Video frequency selecting method, device and storage medium
CN111637874A (en) * 2020-05-08 2020-09-08 哈尔滨工程大学 Multi-AUV layered detection system and detection method for red tide sea area
CN111771709A (en) * 2020-07-10 2020-10-16 海南省海洋与渔业科学院 Method for restoring seaweed bed in marine ecosystem
CN112906656A (en) * 2021-03-30 2021-06-04 自然资源部第三海洋研究所 Underwater photo coral reef recognition method, system and storage medium
CN112967176A (en) * 2021-02-03 2021-06-15 成都理工大学 Method for analyzing plant coverage by using Image J and Photoshop
CN113160302A (en) * 2021-04-25 2021-07-23 国家海洋局南海环境监测中心(中国海监南海区检验鉴定中心) Coral community analysis method and device


Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
AMMAR MAHMOOD等: "Deep Image Representations for Coral Image Classification", 《IEEE JOURNAL OF OCEANIC ENGINEERING》 *
KATSUNORI MIZUNO等: "Development of an Efficient Coral-Coverage Estimation Method Using a Towed Optical Camera Array System [Speedy Sea Scanner (SSS)] and Deep-Learning-Based Segmentation: A Sea Trial at the Kujuku-Shima Islands", 《IEEE JOURNAL OF OCEANIC ENGINEERING》 *
徐屹伟等: "基于简单帧选择的显著性检测方法", 《计算机工程与应用》 *
陈全功等: "遥感技术在草地资源管理上的应用进展", 《国外畜牧学—草原与牧草》 *

Also Published As

Publication number Publication date
CN113537174B (en) 2021-12-28

Similar Documents

Publication Publication Date Title
CN111784633B (en) Insulator defect automatic detection algorithm for electric power inspection video
CN108686978B (en) ARM-based fruit category and color sorting method and system
CN108875821A (en) The training method and device of disaggregated model, mobile terminal, readable storage medium storing program for executing
CN108806334A (en) A kind of intelligent ship personal identification method based on image
CN110751630B (en) Power transmission line foreign matter detection method and device based on deep learning and medium
CN111178197A (en) Mass R-CNN and Soft-NMS fusion based group-fed adherent pig example segmentation method
Atienza-Vanacloig et al. Vision-based discrimination of tuna individuals in grow-out cages through a fish bending model
CN112580612B (en) Physiological signal prediction method
CN109241941A (en) A method of the farm based on deep learning analysis monitors poultry quantity
CN114998314B (en) PCB defect detection method based on computer vision
CN112396635B (en) Multi-target detection method based on multiple devices in complex environment
US20220128358A1 (en) Smart Sensor Based System and Method for Automatic Measurement of Water Level and Water Flow Velocity and Prediction
CN110728269B (en) High-speed rail contact net support pole number plate identification method based on C2 detection data
CN113256602A (en) Unsupervised fan blade defect detection method and system based on self-encoder
CN116703932A (en) CBAM-HRNet model wheat spike grain segmentation and counting method based on convolution attention mechanism
CN116452966A (en) Target detection method, device and equipment for underwater image and storage medium
CN111291818A (en) Non-uniform class sample equalization method for cloud mask
CN117036352B (en) Video analysis method and system based on artificial intelligence
CN113537174B (en) Coral reef habitat survey video analysis method
CN113899349A (en) Sea wave parameter detection method, equipment and storage medium
CN115830514B (en) Whole river reach surface flow velocity calculation method and system suitable for curved river channel
CN117593601A (en) Water gauge tide checking method based on deep learning
CN115272340B (en) Industrial product defect detection method and device
CN112070181A (en) Image stream-based cooperative detection method and device and storage medium
CN112184627A (en) Citrus fresh-keeping quality detection method based on image processing and neural network and application

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant