CN108154105B - Underwater biological detection and identification method and device, server and terminal equipment

Info

Publication number: CN108154105B (application CN201711395749.0A; earlier publication CN108154105A)
Authority: CN (China)
Prior art keywords: underwater, neural network, deep neural, organisms, image data
Legal status: Active (granted)
Other languages: Chinese (zh)
Inventors: 乔宇 (Qiao Yu), 庄培钦 (Zhuang Peiqin), 邢林杰 (Xing Linjie)
Original and current assignee: Shenzhen Institute of Advanced Technology of CAS
Priority and filing date: 2017-12-21 (CN201711395749.0A)
Publication of CN108154105A: 2018-06-12
Grant and publication of CN108154105B: 2020-08-07


Classifications

    • G06V 20/00 - Scenes; scene-specific elements
    • G06F 18/24 - Pattern recognition; classification techniques
    • G06V 10/462 - Salient features, e.g. scale-invariant feature transform [SIFT]
    • G06N 3/084 - Neural networks; learning by backpropagation, e.g. using gradient descent

Abstract

The invention belongs to the technical field of image processing and provides an underwater biological detection and identification method and device, a server and terminal equipment, wherein the method comprises the following steps: acquiring underwater shooting image data; detecting the shot image data according to a pre-trained deep neural network model, and identifying underwater organisms in the shot image data; classifying and counting underwater biological resources based on the identified underwater organisms; and outputting the classification and counting results of the underwater biological resources. The underwater biological detection and identification method can quickly and accurately identify the types of underwater organisms in an image, reduces manual participation, saves labor and time costs, and improves the accuracy of underwater biological detection and identification.

Description

Underwater biological detection and identification method and device, server and terminal equipment
Technical Field
The invention belongs to the technical field of image processing, and particularly relates to an underwater organism detection and identification method, an underwater organism detection and identification device, a server and terminal equipment.
Background
Illegal and unregulated fishing currently poses a serious threat to the marine ecological environment, and the living environment of offshore underwater organisms in particular continues to deteriorate. To preserve the ecological balance of the marine environment, relevant departments and organizations have begun to take corresponding measures to improve the living conditions of marine organisms. In the course of this work, the categories and quantities of offshore underwater organisms need to be identified and counted regularly so that statistics on offshore underwater biological resources can be compiled. Because of the complexity of underwater creatures, correct judgments on the acquired underwater creature data often depend on the experience of professionals; such manual statistics on underwater biological resources consume a large amount of manpower and time, and the accuracy of underwater organism identification is low.
Disclosure of Invention
In view of the above, the invention provides an underwater biological detection and identification method and device, a server and a terminal device, so as to solve the problems in the prior art that manual statistics of underwater biological resources consume a large amount of manpower and time and that the accuracy of underwater organism identification is low.
The invention provides a method for detecting and identifying underwater creatures, which comprises the following steps:
acquiring underwater shooting image data;
detecting the shot image data according to a pre-trained deep neural network model, and identifying underwater organisms in the shot image data;
classifying and counting underwater biological resources based on the identified underwater organisms;
and outputting the classification and counting results of the underwater biological resources.
The second aspect of the present invention provides an underwater organism detection and identification apparatus, comprising:
the data acquisition unit is used for acquiring underwater shooting image data;
the detection and identification unit is used for detecting the shot image data according to a pre-trained deep neural network model and identifying underwater organisms in the shot image data;
the classification statistical unit is used for classifying and counting the underwater biological resources based on the identified underwater organisms;
and the user interface display unit is used for outputting the classification and counting results of the underwater biological resources.
A third aspect of the invention provides a computer readable storage medium having stored thereon a computer program which, when executed by a processor, performs the steps of:
acquiring underwater shooting image data;
detecting the shot image data according to a pre-trained deep neural network model, and identifying underwater organisms in the shot image data;
classifying and counting underwater biological resources based on the identified underwater organisms;
and outputting the classification and counting results of the underwater biological resources.
A fourth aspect of the present invention provides a server comprising a memory, a processor, and a computer program stored in the memory and executable on the processor, the processor implementing the following steps when executing the computer program:
receiving underwater shooting image data uploaded by terminal equipment;
detecting the shot image data according to a pre-trained deep neural network model, and identifying underwater organisms in the shot image data;
classifying and counting underwater biological resources based on the identified underwater organisms;
and outputting the classification and counting results of the underwater biological resources.
A fifth aspect of the present invention provides a terminal device, comprising a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein the processor implements the following steps when executing the computer program:
shooting underwater organisms to obtain underwater shooting image data;
detecting the shot image data according to a pre-trained deep neural network model, and identifying underwater organisms in the shot image data;
classifying and counting underwater biological resources based on the identified underwater organisms;
and outputting the classification and counting results of the underwater biological resources.
The invention has the beneficial effects that:
according to the invention, underwater shooting image data is obtained; detecting the shot image data according to a pre-trained deep neural network model, and identifying underwater organisms in the shot image data; classifying and counting underwater biological resources based on the identified underwater organisms; and outputting the classification and counting results of the underwater biological resources, so that the types of underwater organisms in the image can be rapidly and accurately counted, the manual participation is reduced, the labor and time cost is saved, and the accuracy of underwater organism identification is improved.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present invention, the drawings required in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings in the following description are only some embodiments of the present invention, and other drawings can be obtained by those skilled in the art based on these drawings without inventive effort.
FIG. 1 is a schematic flow chart of an implementation of a method for detecting and identifying underwater creatures provided by an embodiment of the present invention;
FIG. 2 is a schematic flow chart of an implementation of a method for detecting and identifying underwater creatures according to another embodiment of the present invention;
FIG. 3 is a schematic structural diagram of an underwater organism detection and identification apparatus provided by an embodiment of the present invention;
FIG. 4 is a schematic structural diagram of an underwater organism detection and identification apparatus provided in another embodiment of the present invention;
fig. 5 is a schematic structural diagram of a terminal device according to an embodiment of the present invention.
Detailed Description
In the following description, for purposes of explanation and not limitation, specific details are set forth, such as particular system structures, techniques, etc. in order to provide a thorough understanding of the embodiments of the invention. It will be apparent, however, to one skilled in the art that the present invention may be practiced in other embodiments that depart from these specific details. In other instances, detailed descriptions of well-known systems, devices, circuits, and methods are omitted so as not to obscure the description of the present invention with unnecessary detail.
In order to explain the technical means of the present invention, the following description will be given by way of specific examples.
Fig. 1 is a schematic flow chart of an implementation of the underwater biological detection and identification method provided by the embodiment of the invention. Referring to fig. 1, the implementation flow of the underwater biological detection and identification method provided by the embodiment is detailed as follows:
and S101, acquiring underwater shooting image data.
It should be noted that, an executing subject of the underwater biological detection and identification method provided in this embodiment may be a server or a terminal device, and when the executing subject is the server, the step S101 specifically includes: receiving underwater shot image data uploaded by terminal equipment; when the execution subject is a terminal device, step S101 specifically includes: and shooting underwater organisms to obtain underwater image shooting data.
In this embodiment, after the underwater shooting image data is acquired, low-level image processing techniques can be used to remove the noise caused by the uneven distribution of underwater illumination and to adjust the contrast of the shot image data, so that the shot image data can be used for the subsequent detection and identification of underwater creatures.
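By way of illustration only, the following sketch shows one possible form of this preprocessing step in Python with OpenCV. The patent does not name the specific low-level operations, so the use of non-local-means denoising and CLAHE contrast adjustment, as well as the file paths, are assumptions.

```python
# Hypothetical preprocessing sketch for underwater frames (the patent only says
# "low-level image processing"; the denoising method and CLAHE are assumptions).
import cv2


def preprocess_underwater_image(bgr_image):
    """Remove illumination noise and adjust contrast of a captured BGR frame."""
    # Suppress sensor/illumination noise with non-local means denoising.
    denoised = cv2.fastNlMeansDenoisingColored(bgr_image, None, 10, 10, 7, 21)

    # Equalize contrast on the lightness channel only, so colors are preserved.
    lab = cv2.cvtColor(denoised, cv2.COLOR_BGR2LAB)
    l, a, b = cv2.split(lab)
    clahe = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8))
    l = clahe.apply(l)
    return cv2.cvtColor(cv2.merge((l, a, b)), cv2.COLOR_LAB2BGR)


if __name__ == "__main__":
    frame = cv2.imread("underwater_frame.jpg")  # hypothetical input path
    cv2.imwrite("underwater_frame_clean.jpg", preprocess_underwater_image(frame))
```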
And S102, detecting the shot image data according to a pre-trained deep neural network model, and identifying underwater organisms in the shot image data.
In this embodiment, before step S101, the method further includes:
acquiring a training data set of underwater organisms;
and selecting a basic model of the deep neural network, adjusting network parameters of the basic model, and training the training data set to obtain the deep neural network model of the underwater creature.
In this embodiment, acquiring the training data set of underwater creatures includes, but is not limited to, acquiring training data sets for designated varieties of offshore underwater creatures. Specifically, relevant offshore underwater organism pictures can be collected in advance, and the collected pictures are manually classified and screened by category to complete the preparatory work. All the underwater organism pictures corresponding to each category of underwater organisms obtained in this data-sorting process are then taken as the training data for that category and input into the underwater biological detection and identification device.
In this embodiment, after the deep neural network basic model corresponding to each underwater creature is obtained, the network parameters of the basic model need to be adjusted according to the requirements of the task, the characteristics of the data and other factors. The network parameters include, but are not limited to, the learning rate, the learning strategy and the batch size.
Preferably, in this embodiment, the network parameters include the learning rate and the batch size. The learning rate determines the magnitude of each parameter update of the deep neural network, and whether the network can converge effectively and quickly depends on how suitably it is chosen. Because the deep neural network approximates the gradient with the stochastic gradient descent method, the amount of data processed in each iteration directly affects the randomness of the algorithm; within the limits of the available computing resources, the batch size is set so that the randomness introduced during convergence allows the deep neural network to jump out of local extreme points and obtain a better network model. Preferably, in this embodiment, the network parameters of the deep neural network basic model are adjusted by the back propagation algorithm.
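The following minimal PyTorch sketch illustrates where the learning rate, batch size and back propagation enter the training loop. The toy dataset, the tiny stand-in model and the concrete hyper-parameter values are placeholders, not the configuration used by the invention.

```python
# Minimal sketch of the hyper-parameters discussed above (learning rate, batch
# size, SGD with back propagation). Dataset, model and values are placeholders.
import torch
from torch.utils.data import DataLoader, TensorDataset

# Stand-in dataset/model; a real setup would use the underwater training set
# and an SSD-style detector instead of this toy classifier.
images = torch.randn(256, 3, 64, 64)
labels = torch.randint(0, 5, (256,))
dataset = TensorDataset(images, labels)

model = torch.nn.Sequential(
    torch.nn.Conv2d(3, 16, 3, padding=1), torch.nn.ReLU(),
    torch.nn.AdaptiveAvgPool2d(1), torch.nn.Flatten(),
    torch.nn.Linear(16, 5),
)

batch_size = 32               # changes the stochasticity of the gradient estimates
learning_rate = 1e-3          # controls the magnitude of each parameter update
loader = DataLoader(dataset, batch_size=batch_size, shuffle=True)
optimizer = torch.optim.SGD(model.parameters(), lr=learning_rate, momentum=0.9)
criterion = torch.nn.CrossEntropyLoss()

for epoch in range(2):
    for x, y in loader:
        optimizer.zero_grad()
        loss = criterion(model(x), y)
        loss.backward()       # back propagation computes the gradients
        optimizer.step()      # SGD update scaled by the learning rate
```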
Given the complexity of underwater organism detection and identification, different underwater organisms differ in appearance, size, color and other morphological characteristics, and the morphological differences of most underwater organisms are concentrated in local regions that are not easy to distinguish; it is therefore important to choose a deep network that can extract rich and representative features from the original image. In this embodiment, the Single Shot MultiBox Detector (SSD) is selected as the basic model of the deep neural network, and the model is adjusted to meet the task requirements of automatic underwater organism detection and identification. It should be noted that the basic model of the deep neural network includes, but is not limited to, the SSD framework.
The SSD uses standard convolutional layers, pooling layers and activation functions as the basic framework of the network. The convolutional layers perform convolution operations on the input underwater organism image to produce different output images, generating feature maps with different characteristics and simulating the feature-extraction process; the pooling layers down-sample the input image according to certain rules, reducing the redundant parts of the spatial information in the image; and the activation functions limit the value range of each element in the feature map, ensuring that the resulting values stay within a reasonable range. On top of this basic framework, the SSD uses the VGG16 network model as the main convolutional backbone of the detection and identification network, and the output part of the network produces the bounding-box information and category of each possible object. The backbone of the SSD includes, but is not limited to, the VGG16 network model, and may be replaced by a ResNet or BN-Inception network model.
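As an informal illustration of this architecture, the sketch below attaches SSD-style prediction heads (per-anchor class scores and box offsets) to a set of multi-scale feature maps in PyTorch. The channel counts, anchor count and class count are assumed values, and the dummy feature maps stand in for a VGG16-like backbone.

```python
# Schematic SSD-style prediction heads: each feature map gets a classification
# conv and a box-regression conv per anchor. Channel/anchor counts are assumed.
import torch
import torch.nn as nn


class SSDStyleHeads(nn.Module):
    def __init__(self, feature_channels=(512, 1024, 512), num_classes=5, num_anchors=4):
        super().__init__()
        self.cls_heads = nn.ModuleList(
            nn.Conv2d(c, num_anchors * num_classes, kernel_size=3, padding=1)
            for c in feature_channels
        )
        self.box_heads = nn.ModuleList(
            nn.Conv2d(c, num_anchors * 4, kernel_size=3, padding=1)
            for c in feature_channels
        )
        self.num_classes = num_classes

    def forward(self, feature_maps):
        scores, boxes = [], []
        for fmap, cls_head, box_head in zip(feature_maps, self.cls_heads, self.box_heads):
            n = fmap.shape[0]
            # (N, A*C, H, W) -> (N, H*W*A, C): one score vector per anchor box
            scores.append(cls_head(fmap).permute(0, 2, 3, 1).reshape(n, -1, self.num_classes))
            # (N, A*4, H, W) -> (N, H*W*A, 4): one offset vector per anchor box
            boxes.append(box_head(fmap).permute(0, 2, 3, 1).reshape(n, -1, 4))
        return torch.cat(scores, dim=1), torch.cat(boxes, dim=1)


# Example with dummy multi-scale feature maps (e.g. from a VGG16-like backbone).
feats = [torch.randn(1, 512, 38, 38), torch.randn(1, 1024, 19, 19), torch.randn(1, 512, 10, 10)]
cls_scores, box_offsets = SSDStyleHeads()(feats)
```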
Preferably, in this embodiment, step S102 specifically includes:
extracting multi-layer convolutional feature maps of the underwater organisms in the training data set according to the pre-trained deep neural network model;
fusing the multi-layer convolutional feature maps of the underwater organisms to obtain a multi-layer convolutional feature fusion map of the underwater organisms;
and detecting the underwater organisms included in the shot image data according to the multi-layer convolutional feature fusion map of the underwater organisms, and identifying the types of the underwater organisms.
In this embodiment, the feature maps of convolutional layers at different depths are fused. For convolutional layers at different depths, the feature maps of adjacent layers are up-sampled or down-sampled as appropriate so that the resolutions of the different feature maps are unified, after which feature fusion is completed. This greatly enriches the feature information available to the detection module of the deep neural network and improves the accuracy of the detection frames it outputs. At the same time, the multi-layer convolutional feature fusion strategy also increases, to a certain extent, the sensitivity of a single deep convolutional neural network to underwater objects of different scales.
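A minimal sketch of this fusion step is given below: a deeper feature map is up-sampled and a shallower one is down-sampled to the resolution of the middle layer before the maps are merged. Whether the maps are added or concatenated is not specified in the patent, so concatenation followed by a 1x1 convolution is an assumption.

```python
# Sketch of multi-layer feature fusion: a deeper (coarser) map is up-sampled,
# a shallower (finer) map is down-sampled, and all three are merged at the
# resolution of the middle map. Concatenation + 1x1 conv is an assumption.
import torch
import torch.nn.functional as F


def fuse_adjacent_feature_maps(shallow, middle, deep, proj):
    """shallow/middle/deep: NCHW feature maps from consecutive depths."""
    target_size = middle.shape[-2:]
    shallow_ds = F.adaptive_max_pool2d(shallow, target_size)         # down-sample
    deep_us = F.interpolate(deep, size=target_size, mode="nearest")  # up-sample
    fused = torch.cat([shallow_ds, middle, deep_us], dim=1)
    return proj(fused)  # 1x1 conv restores the channel count for the detector


shallow = torch.randn(1, 256, 76, 76)
middle = torch.randn(1, 512, 38, 38)
deep = torch.randn(1, 1024, 19, 19)
proj = torch.nn.Conv2d(256 + 512 + 1024, 512, kernel_size=1)
fused_map = fuse_adjacent_feature_maps(shallow, middle, deep, proj)  # (1, 512, 38, 38)
```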
And S103, classifying and counting the underwater biological resources based on the identified underwater organisms.
In this embodiment, after identifying the underwater creatures in the captured image data, the underwater creature resource classification and statistics can be performed based on the identification result.
And step S104, outputting the classification and counting results of the underwater biological resources.
In this embodiment, step S104 specifically includes: and displaying the classification and counting results of the underwater biological resources through a display device, and displaying the statistical results of the underwater resources to a user in real time.
Preferably, in this embodiment, a visual interface may be further provided in the whole underwater creature detection and identification process to show the user the relevant information of the target underwater creature in the image, for example: the position coordinates of the target underwater creature in the image, the size of the target individual and the like, so as to improve the interaction friendliness with the user.
As can be seen from the above, the underwater biological detection and identification method provided by the embodiment obtains the underwater shot image data; detecting the shot image data according to a pre-trained deep neural network model, and identifying underwater organisms in the shot image data; classifying and counting underwater biological resources based on the identified underwater organisms; and outputting the classification and counting results of the underwater biological resources, so that the types of underwater organisms in the image can be rapidly and accurately counted, the manual participation is reduced, the labor and time cost is saved, and the accuracy of underwater organism identification is improved.
Fig. 2 is a schematic flow chart illustrating an implementation of the underwater biological detection and identification method according to another embodiment of the present invention. Referring to fig. 2, the underwater biological detection and identification method provided by the embodiment includes the following steps:
step S201, obtaining a training set of the underwater creature, and carrying out multi-scale scaling on an original image in the training data set of the underwater creature to obtain training data sets corresponding to the underwater creature under different scales.
In this embodiment, acquiring the training data sets of the various underwater creatures includes, but is not limited to, acquiring training data sets for designated varieties of offshore underwater creatures. Specifically, relevant offshore underwater organism pictures can be collected in advance, and the collected pictures are manually classified and screened by category to complete the preparatory work. All the underwater organism pictures corresponding to each category of underwater organisms are gathered in this data-sorting process; 90% of the underwater organism pictures for each category are used as the training data set corresponding to that category and input into the underwater biological detection and identification device, while the remaining 10% of the underwater organism picture data are reserved as validation data for finding the optimal network parameters during implementation.
Step S202, selecting a basic model of the deep neural network, adjusting network parameters of the basic model, and training data sets corresponding to the underwater creatures under different scales to obtain the deep neural network models corresponding to the underwater creatures under different scales.
In this embodiment, the original underwater organism images are scaled to multiple sizes, so that the same batch of data is converted into several batches of images at different scales, and the corresponding deep neural basic networks are trained on the images at the different scales in a later stage.
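The multi-scale scaling of the training images can be sketched as follows; the concrete scale values (300, 384 and 512 pixels) and directory names are illustrative assumptions rather than values taken from the patent.

```python
# Sketch of multi-scale scaling: every original picture is resized to several
# input sizes so that one network per scale can be trained later.
import cv2
from pathlib import Path

SCALES = (300, 384, 512)  # assumed scale values


def build_multiscale_sets(src_dir, dst_dir):
    for image_path in Path(src_dir).glob("*.jpg"):
        original = cv2.imread(str(image_path))
        if original is None:
            continue
        for scale in SCALES:
            resized = cv2.resize(original, (scale, scale))
            out_dir = Path(dst_dir) / f"scale_{scale}"
            out_dir.mkdir(parents=True, exist_ok=True)
            cv2.imwrite(str(out_dir / image_path.name), resized)


build_multiscale_sets("underwater_train", "underwater_train_multiscale")  # hypothetical paths
```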
Preferably, in this embodiment, after step S202, the method includes:
step S203, test image data is acquired.
In this embodiment, the test image data is underwater biological image data actually acquired in water by using a camera device at a stage of testing the deep neural network obtained by training.
And S204, detecting underwater creatures in the test image data by adopting corresponding deep neural network models under different scales.
In this embodiment, the accuracy of the recognition of the deep neural network model can be evaluated by testing the deep neural network model by using the test image data.
And S205, combining the detection frames output by the corresponding deep neural network models under different scales for each test image, and removing the overlapped detection frames through a non-maximum suppression algorithm.
In this embodiment, the original underwater organism images are scaled to multiple sizes in the early stage, and the corresponding deep neural networks are trained on data at the different scales in the later stage; different deep neural networks have different recognition capabilities for images at different scales. In the testing stage, multi-model fusion is performed: the detection frames of the multiple models are merged, and the overlapped detection frames are removed with a non-maximum suppression algorithm to obtain the final detection frames.
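A NumPy sketch of this fusion step is shown below: the detections of the per-scale models are concatenated and overlapped detection frames are removed with greedy non-maximum suppression. The IoU threshold of 0.5 is an assumed value.

```python
# Greedy non-maximum suppression over the concatenated detections of all
# per-scale models. Boxes are (x1, y1, x2, y2); the 0.5 IoU threshold is assumed.
import numpy as np


def non_max_suppression(boxes, scores, iou_threshold=0.5):
    order = scores.argsort()[::-1]          # highest-confidence boxes first
    keep = []
    while order.size > 0:
        i = order[0]
        keep.append(i)
        rest = order[1:]
        # Intersection of the kept box with all remaining boxes.
        x1 = np.maximum(boxes[i, 0], boxes[rest, 0])
        y1 = np.maximum(boxes[i, 1], boxes[rest, 1])
        x2 = np.minimum(boxes[i, 2], boxes[rest, 2])
        y2 = np.minimum(boxes[i, 3], boxes[rest, 3])
        inter = np.clip(x2 - x1, 0, None) * np.clip(y2 - y1, 0, None)
        area_i = (boxes[i, 2] - boxes[i, 0]) * (boxes[i, 3] - boxes[i, 1])
        area_r = (boxes[rest, 2] - boxes[rest, 0]) * (boxes[rest, 3] - boxes[rest, 1])
        iou = inter / (area_i + area_r - inter + 1e-9)
        order = rest[iou <= iou_threshold]  # drop boxes overlapping the kept one
    return keep


# Detections from two per-scale models for the same test image, merged first.
boxes = np.array([[10, 10, 60, 60], [12, 11, 61, 59], [100, 80, 150, 140]], dtype=float)
scores = np.array([0.9, 0.8, 0.7])
final_indices = non_max_suppression(boxes, scores)   # -> [0, 2]
```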
Preferably, in this embodiment, the training process on the training data set further includes:
adding a network branch on the basis of the basic model of the deep neural network, and normalizing the feature map learned by the deep neural network to obtain a saliency map;
and performing an element-wise product of the saliency map and the feature map to obtain a feature map with irrelevant background information removed.
Since underwater organism images are generally mixed with a complex underwater environment background, using the saliency map to remove irrelevant background information can improve the effectiveness of the information learned by the deep neural network.
In this embodiment, a network branch is added on the basis of the basic model of the deep neural network; basic components including, but not limited to, convolutional layers, pooling layers and activation functions are added to this branch, and the learned feature map is normalized by a logistic (sigmoid) function to obtain a normalized saliency map. The normalized saliency map is then multiplied element-wise with the feature map of the original image, completing the separation of foreground and background through this point-wise multiplication. The saliency-map generation network combines these basic components sequentially, possibly several times, which corresponds to the segmentation step of traditional image processing; because foreground extraction is folded into the end-to-end training of the deep neural network, the error accumulation caused by a multi-step segmentation process is avoided, the reliability of the deep neural network model corresponding to each underwater creature is improved, and the performance of later image recognition is improved.
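The saliency branch can be sketched as follows in PyTorch: a small branch of convolution, activation and pooling layers predicts a one-channel map, the sigmoid function normalizes it to [0, 1], and an element-wise product suppresses the background in the backbone feature map. The branch depth and channel sizes are assumptions.

```python
# Sketch of the saliency branch: a small conv branch predicts a one-channel map,
# the sigmoid squashes it to [0, 1], and an element-wise product suppresses the
# background in the backbone feature map. Branch depth/channels are assumptions.
import torch
import torch.nn as nn


class SaliencyGate(nn.Module):
    def __init__(self, channels=512):
        super().__init__()
        self.branch = nn.Sequential(
            nn.Conv2d(channels, channels // 4, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.MaxPool2d(kernel_size=3, stride=1, padding=1),  # keeps resolution
            nn.Conv2d(channels // 4, 1, kernel_size=1),
        )

    def forward(self, feature_map):
        saliency = torch.sigmoid(self.branch(feature_map))  # normalized saliency map
        return feature_map * saliency, saliency             # broadcast over channels


features = torch.randn(1, 512, 38, 38)
gated_features, saliency_map = SaliencyGate()(features)
```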
Since steps S206 to S209 in this embodiment are completely the same as the steps S101 to S104 in the previous embodiment, they are not described herein again.
Compared with the previous embodiment, the underwater biological detection and identification method provided by this embodiment scales the original images to multiple sizes and trains multiple neural networks simultaneously; because the deep neural networks have different recognition capabilities for images at different scales, fusing the detection frames output by the multiple networks in the later stage improves the performance of the deep neural networks. In addition, the saliency map network module guides the localization of the foreground content of the underwater organism image and suppresses irrelevant background information in the image, which improves the effectiveness of feature extraction by the deep neural network and further improves the accuracy of subsequent underwater organism image recognition.
Fig. 3 is a schematic structural diagram of an underwater organism detection and identification device provided by an embodiment of the invention. Only the portions related to the present embodiment are shown for convenience of explanation.
Referring to fig. 3, the underwater organism detection and identification apparatus 3 provided in the present embodiment includes:
the data acquisition unit 31 is used for acquiring underwater shooting image data;
the detection and identification unit 32 is used for detecting the shot image data according to a pre-trained deep neural network model and identifying underwater organisms in the shot image data;
a classification statistical unit 33 for performing classification and counting of underwater biological resources based on the identified underwater creatures;
and the user interface display unit 34 is used for outputting the classification and counting results of the underwater biological resources.
Optionally, the underwater biological detection and recognition device 3 further includes a first training unit 35, configured to:
acquiring a training data set of underwater organisms;
and selecting a basic model of the deep neural network, adjusting network parameters of the basic model, and training the training data set to obtain the deep neural network model of the underwater creature.
Optionally, referring to fig. 4, the underwater biological detection and recognition apparatus 3 further includes a second training unit 36 for:
acquiring a training data set of an underwater organism, and carrying out multi-scale scaling on an original image in the training data set of the underwater organism to obtain training data sets corresponding to the underwater organism under different scales;
selecting a basic model of the deep neural network, adjusting network parameters of the basic model, and training data sets corresponding to the underwater creatures under different scales to obtain the deep neural network models corresponding to the underwater creatures under different scales.
Optionally, the underwater biological detection and identification device 3 further includes a test unit 37, configured to:
acquiring test image data;
detecting underwater organisms in the test image data by adopting the corresponding deep neural network models under different scales;
and combining, for each test image, the detection frames output by the deep neural network models corresponding to the different scales, and removing the overlapped detection frames through a non-maximum suppression algorithm.
Optionally, the underwater biological detection and identification apparatus 3 further includes a saliency map processing unit 38, configured to:
adding a network branch on the basis of the basic model of the deep neural network, and normalizing the feature map learned by the deep neural network to obtain a saliency map;
and performing an element-wise product of the saliency map and the feature map to obtain a feature map with irrelevant background information removed.
Optionally, the detecting and identifying unit 32 is specifically configured to:
extracting multi-layer convolutional feature maps of the underwater organisms in the training data set according to the pre-trained deep neural network model;
fusing the multi-layer convolutional feature maps of the underwater organisms to obtain a multi-layer convolutional feature fusion map of the underwater organisms;
and detecting the underwater organisms included in the shot image data according to the multi-layer convolutional feature fusion map of the underwater organisms, and identifying the types of the underwater organisms.
It should be noted that, since each unit of the underwater biological detection and identification device provided in the embodiment of the present invention is based on the same concept as that of the embodiment of the method of the present invention, the technical effect thereof is the same as that of the embodiment of the method of the present invention, and specific contents thereof can be referred to the description of the embodiment of the method of the present invention, and are not described herein again. The device can be used on one side of the server to analyze and process underwater shot image data uploaded by the terminal equipment and obtain an underwater resource statistical result; the method can also be used on the side of the terminal equipment, so that the terminal equipment can analyze and process the collected underwater biological shooting image data to obtain an underwater resource statistical result.
Therefore, the underwater creature detection and identification device provided by the embodiment of the invention can quickly and accurately identify the types of the underwater creatures in the image, reduces manual participation, saves labor and time cost, and improves the accuracy of underwater creature identification.
It should be understood that, the sequence numbers of the steps in the foregoing embodiments do not imply an execution sequence, and the execution sequence of each process should be determined by its function and inherent logic, and should not constitute any limitation to the implementation process of the embodiments of the present invention.
The method or apparatus described in the above embodiments of the present invention may implement the service logic through a computer program and record the service logic on a storage medium, where the storage medium may be read and executed by a computer, so as to implement the effect of the solution described in the embodiments of the present invention. Accordingly, the present invention also provides a computer readable storage medium having stored thereon computer instructions that when executed perform the steps of:
acquiring underwater shooting image data;
detecting the shot image data according to a pre-trained deep neural network model, and identifying underwater organisms in the shot image data;
classifying and counting underwater biological resources based on the identified underwater organisms;
and outputting the classification and counting results of the underwater biological resources.
The method or apparatus or computer readable storage medium described above may be used in a server. In a specific embodiment, the server may include a processor and a memory for storing processor-executable instructions, where the processor executes the instructions to implement the following steps:
receiving underwater shooting image data uploaded by terminal equipment;
detecting the shot image data according to a pre-trained deep neural network model, and identifying underwater organisms in the shot image data;
classifying and counting underwater biological resources based on the identified underwater organisms;
and outputting the classification and counting results of the underwater biological resources.
The method or the device or the computer-readable storage medium can be used for the terminal device side, and the terminal device side is used for realizing the statistics of the underwater biological resources. Fig. 5 shows a schematic structural diagram of a terminal device provided in an embodiment of the present invention. Referring to fig. 5, the terminal device 5 includes a memory 51, a processor 50, and a computer program 52 stored in the memory 51 and executable on the processor 50, and the processor 50 implements the following steps when executing the computer program 52:
shooting underwater organisms to obtain underwater shooting image data;
detecting the shot image data according to a pre-trained deep neural network model, and identifying underwater organisms in the shot image data;
classifying and counting underwater biological resources based on the identified underwater organisms;
and outputting the classification and counting results of the underwater biological resources.
Illustratively, the computer program 52 may be partitioned into one or more modules/units that are stored in the memory 51 and executed by the processor 50 to implement the present invention. The one or more modules/units may be a series of computer program instruction segments capable of performing specific functions for describing the execution of the computer program 52 in the underwater biological detection and identification device.
The terminal device 5 may include, but is not limited to, a processor 50 and a memory 51. It will be understood by those skilled in the art that fig. 5 is only an example of the terminal device 5, and does not constitute a limitation to the terminal device 5, and may include more or less components than those shown, or combine some components, or different components, for example, the terminal device 5 may further include an input-output device, a network access device, a bus, etc.
The processor 50 may be a Graphics Processing Unit (GPU), a Central Processing Unit (CPU), another general-purpose processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or the like. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor or the like.
The memory 51 may be an internal storage unit of the terminal device 5, such as a hard disk or memory of the terminal device 5. The memory 51 may also be an external storage device of the terminal device 5, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) card or a Flash Card provided on the terminal device 5. Further, the memory 51 may also include both an internal storage unit and an external storage device of the terminal device 5. The memory 51 is used for storing the computer program and other programs and data required by the terminal device, and may also be used to temporarily store data that has been output or is to be output.
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the above-mentioned division of the functional units and modules is illustrated, and in practical applications, the above-mentioned function distribution may be performed by different functional units and modules according to needs, that is, the internal structure of the apparatus is divided into different functional units or modules to perform all or part of the above-mentioned functions. Each functional unit and module in the embodiments may be integrated in one processing unit, or each unit may exist alone physically, or two or more units are integrated in one unit, and the integrated unit may be implemented in a form of hardware, or in a form of software functional unit. In addition, specific names of the functional units and modules are only for convenience of distinguishing from each other, and are not used for limiting the protection scope of the present application. The specific working processes of the units and modules in the system may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
Although this disclosure refers to the construction of data models, data acquisition, interaction, computation, judgment and the like, such as deep neural network detection and the recognition and classification of underwater organism images, the present application is not limited to situations that must conform to industry communication standards, standard data models, computer processing and storage rules, or the embodiments described herein. Implementations that follow certain industry standards, or that are slightly modified from the described embodiments using custom approaches or the examples herein, can also achieve the same, equivalent or similar effects as the above embodiments, or other expected implementation effects. Embodiments applying such modified or transformed methods of data acquisition, storage, judgment and processing may still fall within the scope of the optional implementations of the present application.
In the above embodiments, the descriptions of the respective embodiments have respective emphasis, and reference may be made to the related descriptions of other embodiments for parts that are not described or illustrated in a certain embodiment.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present invention.
In the embodiments provided in the present invention, it should be understood that the disclosed apparatus/terminal device and method may be implemented in other ways. For example, the above-described embodiments of the apparatus/terminal device are merely illustrative, and for example, the division of the modules or units is only one logical division, and there may be other divisions when actually implemented, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated modules/units, if implemented in the form of software functional units and sold or used as independent products, may be stored in a computer-readable storage medium. Based on this understanding, all or part of the flow of the methods of the above embodiments of the present invention may also be implemented by a computer program, which may be stored in a computer-readable storage medium; when the computer program is executed by a processor, the steps of the method embodiments are implemented. The computer program comprises computer program code, which may be in the form of source code, object code, an executable file, some intermediate form or the like. The computer-readable medium may include any entity or device capable of carrying the computer program code, a recording medium, a USB disk, a removable hard disk, a magnetic disk, an optical disk, a computer memory, a Read-Only Memory (ROM), a Random Access Memory (RAM), an electrical carrier signal, a telecommunications signal, a software distribution medium and the like. It should be noted that the content contained in the computer-readable medium may be increased or decreased as required by legislation and patent practice in a given jurisdiction; for example, in some jurisdictions, the computer-readable medium does not include electrical carrier signals and telecommunications signals in accordance with legislation and patent practice.
The above-mentioned embodiments are only used for illustrating the technical solutions of the present invention, and not for limiting the same; although the present invention has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; such modifications and substitutions do not substantially depart from the spirit and scope of the embodiments of the present invention, and are intended to be included within the scope of the present invention.

Claims (6)

1. An underwater biological detection and identification method, comprising:
acquiring a training data set of underwater organisms, specifically: acquiring all underwater organism pictures corresponding to each category of underwater organisms, taking all the underwater organism pictures corresponding to the underwater organisms as training data corresponding to the category of underwater organisms, and carrying out multi-scale scaling on original images in a training data set of the underwater organisms to obtain training data sets corresponding to the underwater organisms under different scales;
selecting a basic model of a deep neural network, adjusting network parameters of the basic model, and training data sets corresponding to the underwater creatures under different scales to obtain deep neural network models corresponding to the underwater creatures under different scales;
adding a network branch on the basis of the basic model of the deep neural network, and normalizing the feature map learned by the deep neural network to obtain a saliency map;
performing an element-wise product of the saliency map and the feature map to obtain a feature map with irrelevant background information removed;
acquiring test image data;
detecting underwater organisms in the test image data by adopting corresponding deep neural network models under different scales;
combining, for each test image, the detection frames output by the deep neural network models corresponding to the different scales, and removing the overlapped detection frames through a non-maximum suppression algorithm to obtain final detection frames, so as to realize fusion of the deep neural network models corresponding to the different scales;
acquiring underwater shooting image data;
detecting the shot image data according to a pre-trained deep neural network model, and identifying underwater organisms in the shot image data;
classifying and counting underwater biological resources based on the identified underwater organisms;
and outputting the classification and counting results of the underwater biological resources.
2. The underwater organism detection and identification method of claim 1, wherein the detecting the shot image data according to the pre-trained deep neural network model, and the identifying the underwater organism in the shot image data comprises:
extracting multi-layer convolutional feature maps of the underwater organisms in the training data set according to the pre-trained deep neural network model;
fusing the multi-layer convolutional feature maps of the underwater organisms to obtain a multi-layer convolutional feature fusion map of the underwater organisms;
and detecting the underwater organisms included in the shot image data according to the multi-layer convolutional feature fusion map of the underwater organisms, and identifying the types of the underwater organisms.
3. An underwater organism detection and identification apparatus, comprising:
a first training unit to: acquiring a training data set of underwater organisms, specifically: acquiring all underwater organism pictures corresponding to each category of underwater organisms, and taking all the underwater organism pictures corresponding to the underwater organisms as training data corresponding to the underwater organisms; selecting a basic model of a deep neural network, adjusting network parameters of the basic model, and training the training data set to obtain a deep neural network model of the underwater creature;
a second training unit to: acquiring a training data set of an underwater organism, and carrying out multi-scale scaling on an original image in the training data set of the underwater organism to obtain training data sets corresponding to the underwater organism under different scales; selecting a basic model of a deep neural network, adjusting network parameters of the basic model, and training data sets corresponding to the underwater creatures under different scales to obtain deep neural network models corresponding to the underwater creatures under different scales;
the test unit is used for acquiring test image data; detecting underwater organisms in the test image data by adopting the corresponding deep neural network models under different scales; and combining, for each test image, the detection frames output by the deep neural network models corresponding to the different scales, and removing the overlapped detection frames through a non-maximum suppression algorithm to obtain final detection frames, so as to realize fusion of the deep neural network models corresponding to the different scales;
a saliency map processing unit for: adding a network branch on the basis of the basic model of the deep neural network, and normalizing the feature map learned by the deep neural network to obtain a saliency map; and performing an element-wise product of the saliency map and the feature map to obtain a feature map with irrelevant background information removed;
the data acquisition unit is used for acquiring underwater shooting image data;
the detection and identification unit is used for detecting the shot image data according to a pre-trained deep neural network model and identifying underwater organisms in the shot image data;
the classification statistical unit is used for classifying and counting the underwater biological resources based on the identified underwater organisms;
and the user interface display unit is used for outputting the classification and counting results of the underwater biological resources.
4. A computer-readable storage medium, in which a computer program is stored which, when being executed by a processor, carries out the steps of the method according to claim 1 or 2.
5. A server comprising a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein the processor when executing the computer program implements the steps of:
acquiring a training data set of underwater organisms, specifically: acquiring all underwater organism pictures corresponding to each category of underwater organisms, taking all the underwater organism pictures corresponding to the underwater organisms as training data corresponding to the category of underwater organisms, and carrying out multi-scale scaling on original images in a training data set of the underwater organisms to obtain training data sets corresponding to the underwater organisms under different scales;
selecting a basic model of a deep neural network, adjusting network parameters of the basic model, and training data sets corresponding to the underwater creatures under different scales to obtain deep neural network models corresponding to the underwater creatures under different scales;
adding a network branch on the basis of the basic model of the deep neural network, and normalizing the feature map learned by the deep neural network to obtain a saliency map;
performing an element-wise product of the saliency map and the feature map to obtain a feature map with irrelevant background information removed;
acquiring test image data;
detecting underwater organisms in the test image data by adopting corresponding deep neural network models under different scales;
combining, for each test image, the detection frames output by the deep neural network models corresponding to the different scales, and removing the overlapped detection frames through a non-maximum suppression algorithm to obtain final detection frames, so as to realize fusion of the deep neural network models corresponding to the different scales;
receiving underwater shot image data uploaded by terminal equipment;
detecting the shot image data according to a pre-trained deep neural network model, and identifying underwater organisms in the shot image data;
classifying and counting underwater biological resources based on the identified underwater organisms;
and outputting the classification and counting results of the underwater biological resources.
6. A terminal device comprising a memory, a processor and a computer program stored in the memory and executable on the processor, wherein the processor implements the following steps when executing the computer program:
acquiring a training data set of underwater organisms, specifically: acquiring all underwater organism pictures corresponding to each category of underwater organisms, taking all the underwater organism pictures corresponding to the underwater organisms as training data corresponding to the category of underwater organisms, and carrying out multi-scale scaling on original images in a training data set of the underwater organisms to obtain training data sets corresponding to the underwater organisms under different scales;
selecting a basic model of a deep neural network, adjusting network parameters of the basic model, and training data sets corresponding to the underwater creatures under different scales to obtain deep neural network models corresponding to the underwater creatures under different scales;
adding a network branch on the basis of the basic model of the deep neural network, and normalizing the feature map learned by the deep neural network to obtain a saliency map;
performing an element-wise product of the saliency map and the feature map to obtain a feature map with irrelevant background information removed;
acquiring test image data;
detecting underwater organisms in the test image data by adopting corresponding deep neural network models under different scales;
combining, for each test image, the detection frames output by the deep neural network models corresponding to the different scales, and removing the overlapped detection frames through a non-maximum suppression algorithm to obtain final detection frames, so as to realize fusion of the deep neural network models corresponding to the different scales;
shooting underwater organisms to obtain underwater shooting image data;
detecting the shot image data according to a pre-trained deep neural network model, and identifying underwater organisms in the shot image data;
classifying and counting underwater biological resources based on the identified underwater organisms;
and outputting the classification and counting results of the underwater biological resources.

Legal Events

PB01: Publication
SE01: Entry into force of request for substantive examination
GR01: Patent grant