CN113516046B - Method, device, equipment and storage medium for monitoring biological diversity in area - Google Patents
Method, device, equipment and storage medium for monitoring biological diversity in an area
- Publication number: CN113516046B
- Application number: CN202110542276.2A
- Authority
- CN
- China
- Prior art keywords
- biological
- feature
- organism
- feature extraction
- information
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
- G06F16/51 — Information retrieval of still image data; indexing, data structures and storage structures therefor
- G06F16/583 — Retrieval of still image data characterised by using metadata automatically derived from the content
- G06F16/71 — Information retrieval of video data; indexing, data structures and storage structures therefor
- G06F16/783 — Retrieval of video data characterised by using metadata automatically derived from the content
- G06F16/7837 — Retrieval of video data using objects detected or recognised in the video content
- G06F18/211 — Pattern recognition; selection of the most significant subset of features
- G06F18/22 — Pattern recognition; matching criteria, e.g. proximity measures
- G06N3/044 — Neural networks; recurrent networks, e.g. Hopfield networks
- G06N3/084 — Neural network learning methods; backpropagation, e.g. using gradient descent
Abstract
The invention provides a method for monitoring biological diversity in a region, comprising the following steps: acquiring shooting information in the region; extracting a feature set for each organism from the shooting information through different feature extraction models; comparing the features of each feature set with a preset biological feature library to obtain biological individual information for each organism; comparing each piece of biological individual information with the regional biological database recorded for the region; and updating the biological individual information recorded in the regional biological database according to the comparison result. The beneficial effect of the invention is that it realises automatic tracking investigation of biodiversity without requiring a large amount of human resources, so that biodiversity in the region can be investigated more efficiently, accurately and in real time.
Description
Technical Field
The invention relates to the field of artificial intelligence, and in particular to a method, device, equipment and storage medium for monitoring biological diversity in a region.
Background
Society and the economy are developing rapidly, environmental problems are increasingly prominent, and destruction of the ecological environment is driving more and more species to extinction. Biodiversity is the basis of human survival and development and a basic element of human health, so tracking surveys of biodiversity within a region are of great research significance. At present, most biodiversity surveys are carried out manually, by tracking, observing and compiling statistics; collecting and counting data in this manual way is labour-intensive, costly and inefficient.
Disclosure of Invention
The main aim of the invention is to provide a method, device, equipment and storage medium for monitoring biological diversity in a region, so as to solve the problem that collecting and counting data by manual statistics makes biodiversity surveys costly and inefficient.
The invention provides a method for monitoring biological diversity in a region, which comprises the following steps:
Acquiring shooting information in the area;
Extracting a feature set for each organism from the shooting information through different image feature extraction models;
Comparing the features of each feature set with a preset biological feature library to obtain biological individual information for each organism, wherein the biological feature library pre-stores the features of each organism, and the biological individual information is the similarity with each organism in the preset biological feature library;
Comparing each piece of biological individual information with the regional biological database recorded for the region;
and updating the biological individual information recorded in the regional biological database according to the comparison result.
The invention also provides a device for monitoring the biological diversity in the area, which comprises:
the acquisition module is used for acquiring shooting information in the area;
the extraction module is used for extracting the feature set of each organism from the shooting information through different image feature extraction models;
The feature comparison module is used for comparing the features of each feature set with a preset biological feature library to obtain biological individual information for each organism, wherein the biological feature library pre-stores the features of each organism, and the biological individual information is the similarity with each organism in the preset biological feature library;
The information comparison module is used for comparing each piece of biological individual information with the regional biological database recorded for the region;
And the updating module is used for updating the biological individual information recorded in the regional biological database according to the comparison result.
The invention also provides a computer device comprising a memory storing a computer program and a processor implementing the steps of any of the methods described above when the processor executes the computer program.
The invention also provides a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements the steps of any of the methods described above.
The beneficial effects of the invention are as follows: a corresponding feature set is extracted from the shooting information and compared with the preset biological feature library to obtain the individual information of different organisms; each piece of biological individual information is then compared with the organisms in the regional biological database, and the database is updated according to the comparison result. This realises automatic tracking investigation of biodiversity without a large amount of human resources, so that biodiversity in the region can be investigated more efficiently, accurately and in real time.
Drawings
FIG. 1 is a flow chart of a method for monitoring in-region biodiversity according to an embodiment of the invention;
FIG. 2 is a schematic block diagram of an apparatus for monitoring in-region biodiversity according to an embodiment of the invention;
Fig. 3 is a schematic block diagram of a computer device according to an embodiment of the present invention.
The achievement of the objects, functional features and advantages of the present invention will be further described with reference to the accompanying drawings, in conjunction with the embodiments.
Detailed Description
The following description of the embodiments of the present invention is made clearly and fully with reference to the accompanying drawings, in which it is evident that the embodiments described are only some, not all, of the embodiments of the invention. All other embodiments obtained by those skilled in the art on the basis of these embodiments without inventive effort fall within the scope of the invention.
It should be noted that in the embodiments of the present invention all directional indicators (such as up, down, left, right, front and back) are merely used to explain the relative positional relationship and movement of components in a specific posture (as shown in the drawings); if the specific posture changes, the directional indicators change accordingly. A connection may be a direct connection or an indirect connection.
The term "and/or" herein merely describes an association between objects and means that three relations may exist; for example, "A and/or B" may mean: A exists alone, A and B exist together, or B exists alone.
Furthermore, descriptions such as "first" and "second" are for descriptive purposes only and are not to be construed as indicating or implying relative importance or an order among the indicated technical features; a feature qualified by "first" or "second" may explicitly or implicitly include at least one such feature. In addition, the technical solutions of the embodiments may be combined with each other, but only insofar as the combination can be realised by those skilled in the art; when technical solutions are contradictory or cannot be realised, their combination should be considered absent and outside the scope of protection claimed by the invention.
Referring to fig. 1, the present invention proposes a method for monitoring biological diversity in an area, comprising:
S1: acquiring shooting information in the area;
S2: extracting a feature set of each organism from the shooting information through different image feature extraction models;
S3: comparing the characteristics of each characteristic set with a preset biological characteristic library to respectively obtain biological individual information of each organism; the biological characteristic library is pre-stored with the characteristics of each organism, and the biological individual information is the similarity with each organism in the preset biological characteristic library;
S4: comparing each of the biological individual information with a regional biological database recorded in the region;
S5: and updating the biological individual information recorded in the regional biological database according to the comparison result.
As described in step S1, shooting information in the region is acquired. The region may be, for example, a river basin or a park. Shooting information is acquired by deploying high-definition camera equipment: 360-degree panoramic cameras may be installed at intervals within the monitoring range, both on the water surface and under water, so that pictures and videos of animals in the region are captured and uploaded in real time. The shooting information may be video information and/or picture information, and the shooting mode is not limited; for example, a camera may be set to take a picture every 10 s, together with the video of that interval, and upload them to a back-end server over HTTP. The back-end server receives the front-end requests, stores the uploaded photos and videos separately, and saves them to disk classified by date. The record of each HTTP request is stored in a database table, recording the request time, the photo address and other information to facilitate subsequent analysis and calculation.
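As a minimal sketch of the server-side storage step described above (not taken from the patent; the function and file names are illustrative), an uploaded frame can be filed into a date-classified directory and its request record appended to a log:

```python
import datetime
import json
from pathlib import Path

def store_capture(image_bytes: bytes, root: str = "captures") -> Path:
    """Store one uploaded frame under a date-based folder and log the request."""
    now = datetime.datetime.now()
    day_dir = Path(root) / now.strftime("%Y-%m-%d")      # classify by date
    day_dir.mkdir(parents=True, exist_ok=True)
    photo_path = day_dir / f"{now.strftime('%H%M%S_%f')}.jpg"
    photo_path.write_bytes(image_bytes)
    # record the request time and photo address for later analysis
    log_entry = {"time": now.isoformat(), "photo_address": str(photo_path)}
    with open(Path(root) / "requests.jsonl", "a") as f:
        f.write(json.dumps(log_entry) + "\n")
    return photo_path
```

A production system would store the request records in a database table as the patent describes; the JSON-lines log here merely stands in for that table.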
As described in step S2, a feature set for each organism is extracted from the shooting information through different image feature extraction models. Each organism has features of different kinds, such as behavioural features and appearance features, which can be identified by different feature models: behavioural features by a behavioural feature recognition model and appearance features by an appearance feature recognition model. The feature sets corresponding to different organisms are thereby obtained, and each organism is judged according to its features.
As described in step S3, the features of each feature set are compared with a preset biological feature library to obtain the biological individual information of each organism. The preset biological feature library records the different features of organisms, so the acquired feature set can be compared with the features in the library. The comparison mode is not limited; for example, a similarity calculation may be used, in which pixel points are weighted and summed and the two resulting values compared, or the differences between pixel points may be computed and summed, so as to obtain the similarity between the feature set and each stored feature; this is not described here in detail. The biological feature library is an existing worldwide feature library, not merely a regional one.
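The second comparison mode mentioned above (summing per-pixel differences) can be sketched as follows; this is an illustrative example, not the patent's implementation, and it assumes equal-sized grayscale patches with 8-bit pixel values:

```python
import numpy as np

def similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Similarity between two equal-sized grayscale patches, in [0, 1].

    Sums the absolute per-pixel differences and maps the result so that
    identical images score 1.0 and maximally different images score 0.0.
    """
    a = a.astype(np.float64) / 255.0
    b = b.astype(np.float64) / 255.0
    diff = np.abs(a - b).sum() / a.size   # mean absolute difference
    return 1.0 - diff
```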
As described in steps S4 to S5, each piece of biological individual information is compared with the regional biological database recorded for the region, and the biological individual information recorded in the regional biological database is updated according to the comparison result. Generally, surveys have been carried out in the region before, so a regional biological database already exists. If the currently identified organism is in the regional biological database, the identification time can be updated in the database so as to keep a record of the organism; if it is not, its biological individual information can be added to the database, completing the supplementation of the regional biological database. If the current organism cannot be identified, the picture or video may be forwarded to personnel for manual identification.
In one embodiment, before the step S2 of extracting the feature set of each living being from the photographed information through different image feature extraction models, the method further includes:
S101: acquiring a plurality of training data sets of different categories; the training data set comprises the same type of biological image and the labeling features of the corresponding organisms of the biological image;
S102: and respectively inputting each training data set into different initial image feature extraction models to obtain trained image feature extraction models corresponding to each type of organism.
As described in steps S101-S102, feature extraction for the feature sets is prepared. A plurality of training data sets of different categories may be acquired; using training data of a single category increases the training accuracy of the corresponding model. The training data sets of different categories are therefore acquired by category and input into different initial image feature extraction models, which may be behavioural image feature extraction models or appearance image feature extraction models, so as to obtain a trained image feature extraction model for each category of organism. The corresponding features and their labelled features are also obtained, and can be stored in the preset biological feature library to facilitate subsequent comparison.
In one embodiment, the step S3 of comparing each feature set with a preset biometric database includes:
S301: comparing the similarity of each target feature in the feature set with the features in the preset biological feature library;
S302: according to the similarity comparison result, weighting and calculating the organisms in the preset biological feature libraries corresponding to the target features of the organisms to obtain comprehensive similarity values corresponding to the organisms;
S303: judging whether the maximum comprehensive similarity value is larger than a preset similarity value;
S304: if yes, the biological individual information corresponding to the maximum comprehensive similarity value is recorded as the biological individual information of the organism corresponding to the target feature.
As described in steps S301-S302, the organisms in the shooting information are identified. The feature set of an organism is compared with the features in the preset biological feature library. Because features of different dimensions differ in importance, a weighted calculation over the several dimensions is needed, and the weight of each dimension can be preset; for example, if the appearance feature matters more, it can be given a larger weight. A weighted sum is then calculated according to these weights, i.e. the similarity obtained for each dimension is multiplied by the corresponding weight and the products are summed, giving the comprehensive similarity value for each organism.
As described in steps S303-S304, after the comprehensive similarity values are obtained, the organism could be identified directly from them; however, considering that model identification may carry a certain error, the values need to be further checked, i.e. it is judged whether the largest comprehensive similarity value is greater than the preset similarity value. If it is not, the model is considered unable to identify the organism, which must then be sent to the relevant personnel for manual identification.
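Steps S301-S304 can be sketched as a weighted fusion followed by a threshold check. This is an illustrative sketch; the dimension names, weights and threshold are assumptions, not values from the patent:

```python
def identify(per_dim_sims: dict, weights: dict, threshold: float = 0.8):
    """Weighted fusion of per-dimension similarities, then a threshold check.

    per_dim_sims: organism -> {dimension: similarity}, e.g. appearance
    and behaviour similarities against the biological feature library.
    weights: dimension -> preset weight (appearance may get the larger one).
    Returns the best-matching organism, or None when the maximum
    comprehensive similarity does not exceed the preset value
    (the sample is then routed to manual identification).
    """
    scores = {
        org: sum(weights[d] * s for d, s in dims.items())
        for org, dims in per_dim_sims.items()
    }
    best = max(scores, key=scores.get)
    return best if scores[best] > threshold else None
```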
In one embodiment, the step S2 of extracting the feature set of each living being from the photographed information by using different image feature extraction models includes:
S201: and inputting the shooting information into each trained image feature extraction model to respectively obtain respective corresponding feature sets of different organisms contained in the shooting information.
As described in step S201, the shooting information may be input into each trained image feature extraction model to obtain the feature set of each organism. As described above, each initial image feature extraction model is trained with the training data set of a different category of organism; however, one picture or one video generally contains more than one organism. The shooting information is therefore input into every trained image extraction model, so that each model yields the biological features of the category it was trained on, and the feature set of each organism is obtained.
In one embodiment, before the step S302 of weighting and calculating the target feature of the living being corresponding to the living being in each preset biological feature library according to the similarity comparison result, the method further includes:
S3011: acquiring characteristic Gaussian distribution of each organism in the biological characteristic library;
S3012: according to the formula Calculating a feature correlation value between a t feature and a feature Gaussian distribution of an i-th organism in the feature set; wherein Cv (i, t) represents the feature correlation value, p j(xt) represents a probability value corresponding to the t feature in a feature gaussian distribution of the j-th organism, w j represents a weight value corresponding to a very high gaussian distribution of the j-th organism, and M represents the total number of organisms;
S3013: scaling the feature correlation values according to their magnitude to obtain the weight of the corresponding feature for each organism.
As described in step S3011, different features of organisms are generally collected when the biological feature library is constructed, so the number of occurrences of each feature of each organism can be obtained, and the feature Gaussian distribution of each organism can be derived from these occurrence counts.
As described in step S3012, the formula divides the probability of the feature under the i-th organism's distribution by the weighted sum of that probability over all organisms. Thus the more widely a feature occurs, the smaller its feature correlation value, and the rarer it is, the larger the value. This can be understood as follows: a feature that is unique to an organism identifies it most strongly and should therefore be given the greatest weight, so its feature correlation value is the largest.
As described in step S3013, the feature correlation values obtained may differ greatly in magnitude, so scaling is required. Preferably, all feature correlation values are summed and each value is divided by this sum, which yields the weight corresponding to each feature correlation value; alternatively, normalization processing may be performed.
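Steps S3012-S3013 can be sketched as below. Note this is an assumption-laden reconstruction: the patent's formula is garbled in the source, so the ratio form used here (probability under organism i's Gaussian divided by the weighted sum over all M organisms) is inferred from the surrounding description, and the one-dimensional Gaussians are illustrative:

```python
import math

def gauss_pdf(x: float, mean: float, std: float) -> float:
    """Probability density of a one-dimensional Gaussian."""
    return math.exp(-0.5 * ((x - mean) / std) ** 2) / (std * math.sqrt(2 * math.pi))

def feature_correlation(x_t: float, dists: list, weights: list, i: int) -> float:
    """Cv(i, t): p_i(x_t) divided by the weighted sum over all organisms.

    dists: list of (mean, std) per organism; weights: w_j per organism.
    A feature common to many organisms yields a small value; a feature
    unique to organism i yields a large one.
    """
    p = [gauss_pdf(x_t, m, s) for m, s in dists]
    return p[i] / sum(w * pj for w, pj in zip(weights, p))

def scale(values: list) -> list:
    """Divide each correlation value by their sum so they act as weights."""
    total = sum(values)
    return [v / total for v in values]
```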
In one embodiment, the step S5 of updating the biometric individual information recorded in the regional biometric database according to the comparison result includes:
S501: acquiring the time difference from the last updated time to the current time of the biological individual information of each organism in the regional biological database;
S502: judging whether the time difference exceeds a preset time length;
S503: if the preset time length is exceeded, removing the organism from the regional biological database.
As described in steps S501-S503, updating of the biological data in the regional biological database is achieved. The time difference between the last update of each organism in the regional biological database and the current time is obtained, i.e. the length of time for which the organism has not been captured by a camera. If this time difference exceeds the preset time length, the organism can be considered absent from the region and removed from the regional biological database. In this embodiment, an identified organism is first recorded and the time difference is then judged; if the organism is observed, its last-updated time becomes the current time. Real-time updating of the biological data in the regional biological database is thereby realised.
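The pruning rule of steps S501-S503 can be sketched in a few lines; the dictionary-based "database" here is a stand-in for the patent's regional biological database:

```python
import datetime

def prune_database(db: dict, now: datetime.datetime,
                   max_age: datetime.timedelta) -> dict:
    """Drop organisms whose last sighting is older than the preset duration.

    db maps organism name -> last-updated timestamp; an organism not
    photographed for longer than max_age is considered absent from the
    region and removed.
    """
    return {org: seen for org, seen in db.items() if now - seen <= max_age}
```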
In one embodiment, the initial image feature extraction model comprises a recurrent neural network model comprising: an input layer, a hidden layer and an output layer;
Input layer: used for receiving the different types of data in the feature data of the elements;
Hidden layer: used for performing nonlinear processing, with an excitation function, on the element feature data received from the input layer;
Output layer: used for outputting and representing the result of the hidden-layer fitting, i.e., the data type corresponding to the features of the element;
Memory unit: decides whether to write or delete the information held in the neurons, and combines the previously recorded element features, the currently memorized element features, and the currently input element features to retain long-term information.
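One hidden-layer step of such a network can be sketched in plain Python (an illustrative Elman-style recurrent cell with a tanh excitation function; the write/delete gating of the memory unit is omitted for brevity, and all names are assumptions):

```python
import math

def hidden_step(x, h_prev, W_xh, W_hh, b_h):
    """Apply the excitation function (tanh) to the current input features
    combined with the previously recorded state, as the hidden layer does.
    All arguments are plain lists / lists of lists."""
    h = []
    for i in range(len(b_h)):
        s = b_h[i]
        s += sum(W_xh[i][j] * x[j] for j in range(len(x)))            # input-layer contribution
        s += sum(W_hh[i][j] * h_prev[j] for j in range(len(h_prev)))  # recurrent contribution
        h.append(math.tanh(s))
    return h
```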
The step S102 of inputting each training data set into different initial image feature extraction models to obtain trained image feature extraction models corresponding to each type of living beings includes:
S1021: respectively inputting each training data set to the input layer of the corresponding initial image feature extraction model;
S1022: carrying out nonlinear processing on each training data set input by the input layer by using an excitation function through a hidden layer to obtain a fitting result;
S1023: outputting and representing the fitting result through an output layer, and outputting an output result corresponding to each training data set;
S1024: obtaining the trained image feature extraction model after iterative training.
As described in the above steps S1021-S1024, when training of the initial image feature extraction model by the neural network method stops, the parameter values after the current round of training can be obtained. When the features of the living beings in the shooting information are later extracted, only the parameter values in the image feature extraction model need to be swapped, so that multiple models need not be maintained and less space is occupied. Because the number of features in the data can in many cases far exceed the number of training samples, in order to simplify model training, the invention performs feature selection from the feature-extractor parameters using a BP-neural-network-based method, and retrains the reconstructed image feature extraction model according to the labeling features and the corresponding original features of each sample until iteration terminates, obtaining the trained image feature extraction model. Specifically: the labeling features and the original features of each sample are combined to obtain the combined features of each sample; the important features of each sample are screened from the combined features using the variable-importance method of a random forest; and the reconstructed image feature extraction model is retrained with the important features of each sample in the training data until iteration terminates, giving the trained image feature extraction model.
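The feature-screening step can be illustrated with a small sketch: given per-column importance scores (which in the described method would come from a random forest's variable importances), keep only the most important columns of each combined feature vector before retraining. The function name and the top-k selection rule are assumptions:

```python
def select_important_features(samples, importances, top_k):
    """Keep the top_k most important feature columns of every sample,
    preserving the original column order.
    samples: list of equal-length feature vectors (the combined features)
    importances: one importance score per column"""
    keep = sorted(range(len(importances)),
                  key=lambda i: importances[i], reverse=True)[:top_k]
    keep.sort()  # restore original column order
    return [[row[i] for i in keep] for row in samples]
```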
In one embodiment, before the step of extracting the feature set of each living being from the photographed information by different image feature extraction models, the training step of the behavior feature extraction model includes:
S211: acquiring a plurality of pieces of video information containing living beings from the shooting information; wherein the plurality of pieces of video information comprise one group of main video data and at least one group of auxiliary video data;
S212: transmitting the main video data to a first generative adversarial network for training to obtain a first parameter;
S213: inputting the first parameter into a behavior feature extraction model to be trained to obtain an intermediate model;
S214: and inputting the plurality of video information into the intermediate model for three-dimensional data training to obtain a pre-trained behavior feature extraction model.
As described in step S211, in order to acquire the behavioral features of a living being during regional monitoring, a plurality of cameras are generally required to capture it from different angles; the shooting information captured by one of the cameras may be used as the main video data and the remaining shooting information as the auxiliary video data.
As described in step S212, adversarial training is performed with the generation network to be trained and the discrimination network to be trained: preset points of the living beings (generally the joints) in each frame of the main video data are first marked and used as the first output result, so that the output of the generative adversarial network approaches the first output result, and the first parameter of the trained first generative adversarial network is obtained. Specifically, each piece of main video data is used for adversarial training of the generation network and the discrimination network to be trained; once the adversarial training reaches the convergence condition, training of the generative adversarial network is complete and the first parameter is obtained. A Generative Adversarial Network (GAN) is an unsupervised-learning method whose principle is to train two neural networks by having them play a game against each other. Since only the main video data is input, the acquired first parameter is two-dimensional; the intermediate model is then constructed on this basis to facilitate subsequent three-dimensional modeling.
As described in step S213, the first parameter is input into the behavior feature extraction model to be trained to obtain the intermediate model. The two-dimensional knowledge about the main video need not be trained again; only three-dimensional training on this basis is required, i.e., the intermediate model is built, which shortens training time.
As described in step S214, the plurality of video information is input into the intermediate model for three-dimensional data training to obtain a pre-trained behavior feature extraction model. Specifically, a biological three-dimensional model is first obtained from the main video data and the auxiliary video data, and the preset points are marked in it; the marked points are then used as the second output result and compared with the result obtained by directly inputting the video information into the intermediate model, so that adversarial training is performed until the output of the intermediate model approaches the second output result, completing training of the behavior feature extraction model. It should be noted that the behavior feature extraction model is likewise obtained by adversarial training of a generation network and a discrimination network to be trained.
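The two-stage flow of steps S211-S214 can be summarized as a pipeline sketch. The adversarial-training routines themselves are stood in for by caller-supplied functions, since the document does not fix their interfaces; every name here is an assumption used only to show the data flow:

```python
def train_behavior_model(main_video, aux_videos, train_gan_2d, train_3d):
    """S212: train a first generative adversarial network on the main
    video to obtain the first parameter; S213: build the intermediate
    model from it; S214: train the intermediate model on all videos
    with three-dimensional data."""
    first_parameter = train_gan_2d(main_video)
    intermediate_model = {"params_2d": first_parameter}
    intermediate_model["params_3d"] = train_3d(
        [main_video] + list(aux_videos), intermediate_model)
    return intermediate_model
```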
Referring to fig. 2, the present invention also provides a device for monitoring in-region biodiversity, comprising:
an acquisition module 10, configured to acquire shooting information in the area;
an extraction module 20 for extracting a feature set of each living being from the shooting information through different image feature extraction models;
a feature comparison module 30, configured to compare the features of each feature set with a preset biological feature library to obtain the biological individual information of each organism; wherein the biological feature library pre-stores the features of each organism, and the biological individual information is the similarity with each organism in the preset biological feature library;
an information comparison module 40 for comparing each piece of the biological individual information with the regional biological database recorded in the region;
And an updating module 50 for updating the biological individual information recorded in the regional biological database according to the comparison result.
The invention has the beneficial effects that: the corresponding feature set is extracted from the shooting information, then the corresponding feature set is compared with the preset biological feature library, so that different biological individual information is obtained, then the corresponding biological individual information is compared with organisms in the regional biological database, and the regional biological database is updated according to the comparison result, so that the automatic tracking investigation of the biodiversity is realized, a large amount of manpower resources are not needed, and the biodiversity in the region can be investigated more efficiently, accurately and in real time.
Referring to fig. 3, in an embodiment of the present application, there is further provided a computer device, which may be a server, and whose internal structure may be as shown in fig. 3. The computer device includes a processor, a memory, a network interface, and a database connected by a system bus. The processor of the computer device is configured to provide computing and control capabilities. The memory of the computer device includes a non-volatile storage medium and an internal memory. The non-volatile storage medium stores an operating system, a computer program, and a database. The internal memory provides an environment for the operation of the operating system and the computer program in the non-volatile storage medium. The database of the computer device is used to store various biological features and the like. The network interface of the computer device is used for communicating with an external terminal through a network connection. The computer program, when executed by the processor, may implement the method for monitoring in-region biodiversity according to any of the embodiments described above.
It will be appreciated by those skilled in the art that the architecture shown in fig. 3 is merely a block diagram of a portion of the architecture in connection with the present inventive arrangements and is not intended to limit the computer devices to which the present inventive arrangements are applicable.
The embodiment of the application also provides a computer readable storage medium, on which a computer program is stored, which when executed by a processor can implement the method for monitoring the biological diversity in the area according to any of the above embodiments.
Those skilled in the art will appreciate that implementing all or part of the above-described methods may be accomplished by instructing the relevant hardware through a computer program stored on a non-transitory computer-readable storage medium which, when executed, may comprise the steps of the embodiments of the methods described above. Any reference to memory, storage, database, or other medium provided by the present application and used in embodiments may include non-volatile and/or volatile memory. The non-volatile memory may include Read-Only Memory (ROM), Programmable ROM (PROM), Electrically Programmable ROM (EPROM), Electrically Erasable Programmable ROM (EEPROM), or flash memory. Volatile memory may include Random Access Memory (RAM) or external cache memory. By way of illustration and not limitation, RAM is available in a variety of forms such as Static RAM (SRAM), Dynamic RAM (DRAM), Synchronous DRAM (SDRAM), Double Data Rate SDRAM (DDR SDRAM), Enhanced SDRAM (ESDRAM), Synchlink DRAM (SLDRAM), Rambus Direct RAM (RDRAM), and Direct Rambus Dynamic RAM (DRDRAM), among others.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, apparatus, article, or method that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, apparatus, article, or method. Without further limitation, an element defined by the phrase "comprising one … …" does not exclude the presence of other like elements in the process, apparatus, article, or method that comprises the element.
Blockchains are novel application modes of computer technologies such as distributed data storage, point-to-point transmission, consensus mechanisms, encryption algorithms, and the like. The blockchain (Blockchain), essentially a de-centralized database, is a string of data blocks that are generated in association using cryptographic methods, each of which contains information from a batch of network transactions for verifying the validity (anti-counterfeit) of its information and generating the next block. The blockchain may include a blockchain underlying platform, a platform product services layer, and an application services layer.
The blockchain underlying platform may include processing modules for user management, basic services, smart contracts, operation monitoring, and the like. The user management module is responsible for identity information management of all blockchain participants, including maintaining the generation of public and private keys (account management), key management, and maintaining the correspondence between a user's real identity and blockchain address (authority management), and, where authorized, supervising and auditing the transactions of certain real identities and providing rule configuration for risk control (risk-control audit). The basic service module is deployed on all blockchain node devices and is used to verify the validity of service requests and record valid requests to storage; for a new service request, the basic service first performs interface adaptation analysis and authentication, encrypts the service information (consensus management), transmits it completely and consistently to the shared ledger (network communication), and records and stores it. The smart contract module is responsible for contract registration and issuance, contract triggering, and contract execution; a developer can define contract logic through a programming language, publish it to the blockchain (contract registration), and invoke keys or other event triggers to execute according to the logic of the contract terms, completing the contract logic, while the module also provides a contract-upgrade function. The operation monitoring module is mainly responsible for deployment during product release, configuration modification, contract settings, cloud adaptation, and visual output of real-time status during product operation, for example: alarms, monitoring network conditions, monitoring node device health status, etc.
The above description is only of the preferred embodiments of the present invention and is not intended to limit the present invention, but various modifications and variations can be made to the present invention by those skilled in the art. Any modification, equivalent replacement, improvement, etc. made within the spirit and principle of the present invention should be included in the scope of the claims of the present invention.
Claims (9)
1. A method for monitoring biodiversity in a region, comprising:
Acquiring shooting information in the area;
extracting a feature set of each organism from the shooting information through different image feature extraction models; the feature set includes a plurality of features of the living being;
comparing the characteristics of each characteristic set with a preset biological characteristic library to respectively obtain biological individual information of each organism; the biological characteristic library is pre-stored with the characteristics of each organism, and the biological individual information is the similarity with each organism in the preset biological characteristic library; the step of comparing each feature set with a preset biological feature library comprises the following steps:
Comparing the similarity of each target feature in the feature set with the features in the preset biological feature library;
According to the similarity comparison result, weighting and calculating the organisms in the preset biological feature libraries corresponding to the target features of the organisms to obtain comprehensive similarity values corresponding to the organisms;
judging whether the maximum comprehensive similarity value is larger than a preset similarity value or not;
if yes, the biological individual information corresponding to the maximum comprehensive similarity value is recorded as the biological individual information of the organism corresponding to the target characteristic;
Comparing each of the biological individual information with a regional biological database recorded in the region;
Updating the biological individual information recorded in the regional biological database according to the comparison result;
Before the step of extracting the feature set of each organism from the shooting information through different image feature extraction models, the training step of the behavior feature extraction models comprises the following steps:
acquiring a plurality of video information containing living things from the shooting information; wherein the plurality of video information comprises a group of main video data and at least one group of auxiliary video data;
transmitting the main video data to a first generative adversarial network for training to obtain a first parameter;
Inputting the first parameters into a behavior feature extraction model to be trained to obtain an intermediate model;
And inputting the plurality of video information into the intermediate model for three-dimensional data training to obtain a pre-trained behavior feature extraction model.
2. The method for monitoring the biological diversity in the area according to claim 1, wherein before the step of extracting the feature set of each living being from the photographed information by using different image feature extraction models, further comprising:
Acquiring a plurality of training data sets of different categories; the training data set comprises the same type of biological image and the labeling features of the corresponding organisms of the biological image;
And respectively inputting each training data set into different initial image feature extraction models to obtain trained image feature extraction models corresponding to each type of organism.
3. The method for monitoring biological diversity in a region according to claim 1, wherein before the step of weighting and calculating the target feature of the living being corresponding to the living being in each of the predetermined biological feature libraries according to the similarity comparison result, the method further comprises:
Acquiring characteristic Gaussian distribution of each organism in the biological characteristic library;
According to the formula, calculating a feature correlation value between the t-th feature in the feature set and the feature Gaussian distribution of the i-th organism; wherein Cv(i, t) represents the feature correlation value, p_j(x_t) represents the probability value corresponding to the t-th feature under the feature Gaussian distribution of the j-th organism, w_j represents the weight value corresponding to the Gaussian distribution of the j-th organism, and M represents the total number of organisms;
and scaling the characteristic related value according to the value size of the characteristic related value to obtain the weight of the characteristic corresponding to each organism.
4. The method for monitoring the biological diversity in the area according to claim 1, wherein the step of updating the biological individual information recorded in the area biological database according to the comparison result comprises the steps of:
Acquiring the time difference from the last updated time to the current time of the biological individual information of each organism in the regional biological database;
Judging whether the time difference exceeds a preset time length or not;
And if the preset time length is exceeded, removing the living being from the regional living being database.
5. The method of monitoring biological diversity in a region of claim 2, wherein the initial image feature extraction model comprises a recurrent neural network model comprising: an input layer, a hidden layer and an output layer;
the step of inputting each training data set into different initial image feature extraction models to obtain trained image feature extraction models corresponding to each type of organism comprises the following steps:
respectively inputting each training data set to the input layer of the corresponding initial image feature extraction model;
Carrying out nonlinear processing on each training data set input by the input layer by using an excitation function through a hidden layer to obtain a fitting result;
Outputting and representing the fitting result through an output layer, and outputting an output result corresponding to each training data set;
and obtaining the trained image feature extraction model after iterative training.
6. A device for monitoring biodiversity in an area, comprising:
the acquisition module is used for acquiring shooting information in the area;
the extraction module is used for extracting the feature set of each organism from the shooting information through different image feature extraction models;
the characteristic comparison module is used for comparing the characteristics of each characteristic set with a preset biological characteristic library to respectively obtain biological individual information of each organism; the biological characteristic library is pre-stored with the characteristics of each organism, and the biological individual information is the similarity with each organism in the preset biological characteristic library; the step of comparing each feature set with a preset biological feature library comprises the following steps:
Comparing the similarity of each target feature in the feature set with the features in the preset biological feature library;
According to the similarity comparison result, weighting and calculating the organisms in the preset biological feature libraries corresponding to the target features of the organisms to obtain comprehensive similarity values corresponding to the organisms;
judging whether the maximum comprehensive similarity value is larger than a preset similarity value or not;
if yes, the biological individual information corresponding to the maximum comprehensive similarity value is recorded as the biological individual information of the organism corresponding to the target characteristic;
An information comparison module for comparing each of the biological individual information with a regional biological database recorded in the region;
the updating module is used for updating the biological individual information recorded in the regional biological database according to the comparison result;
Before the step of extracting the feature set of each organism from the shooting information through different image feature extraction models, the training step of the behavior feature extraction models comprises the following steps:
acquiring a plurality of video information containing living things from the shooting information; wherein the plurality of video information comprises a group of main video data and at least one group of auxiliary video data;
transmitting the main video data to a first generative adversarial network for training to obtain a first parameter;
Inputting the first parameters into a behavior feature extraction model to be trained to obtain an intermediate model;
And inputting the plurality of video information into the intermediate model for three-dimensional data training to obtain a pre-trained behavior feature extraction model.
7. The in-region biodiversity monitoring device of claim 6, wherein the device further comprises:
The data set acquisition module is used for acquiring a plurality of training data sets of different categories; the training data set comprises the same type of biological image and the labeling features of the corresponding organisms of the biological image;
The data set input module is used for respectively inputting each training data set into different initial image feature extraction models to obtain trained image feature extraction models corresponding to various types of organisms.
8. A computer device comprising a memory and a processor, the memory storing a computer program, characterized in that the processor implements the steps of the method of any one of claims 1 to 5 when the computer program is executed.
9. A computer readable storage medium, on which a computer program is stored, characterized in that the computer program, when being executed by a processor, implements the steps of the method of any of claims 1 to 5.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110542276.2A CN113516046B (en) | 2021-05-18 | 2021-05-18 | Method, device, equipment and storage medium for monitoring biological diversity in area |
Publications (2)
Publication Number | Publication Date |
---|---|
CN113516046A CN113516046A (en) | 2021-10-19 |
CN113516046B true CN113516046B (en) | 2024-06-18 |
Family
ID=78064546
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202110542276.2A Active CN113516046B (en) | 2021-05-18 | 2021-05-18 | Method, device, equipment and storage medium for monitoring biological diversity in area |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN113516046B (en) |
Families Citing this family (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN114399415B (en) * | 2021-12-23 | 2023-04-25 | 广东贝源检测技术股份有限公司 | Biological diversity investigation and evaluation system |
CN114267015B (en) * | 2021-12-24 | 2022-09-09 | 广东蓝鲲海洋科技有限公司 | Intelligent detection method for ocean abnormal area |
CN118537895A (en) * | 2024-07-22 | 2024-08-23 | 百鸟数据科技(北京)有限责任公司 | Method and system for monitoring biodiversity of wetland continent beach |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111401326A (en) * | 2020-04-21 | 2020-07-10 | 招商局金融科技有限公司 | Target identity recognition method based on picture recognition, server and storage medium |
CN111523479A (en) * | 2020-04-24 | 2020-08-11 | 中国农业科学院农业信息研究所 | Biological feature recognition method and device for animal, computer equipment and storage medium |
Family Cites Families (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN108399225A (en) * | 2018-02-12 | 2018-08-14 | 安徽千云度信息技术有限公司 | A kind of information analysis method based on big data |
CN110110707A (en) * | 2019-05-24 | 2019-08-09 | 苏州闪驰数控系统集成有限公司 | Artificial intelligence CNN, LSTM neural network dynamic identifying system |
CN110569721B (en) * | 2019-08-01 | 2023-08-29 | 平安科技(深圳)有限公司 | Recognition model training method, image recognition method, device, equipment and medium |
CN112668365A (en) * | 2019-10-15 | 2021-04-16 | 顺丰科技有限公司 | Material warehousing identification method, device, equipment and storage medium |
CN111695971B (en) * | 2020-06-12 | 2024-01-12 | 腾讯科技(深圳)有限公司 | Article recommendation method, apparatus and device, and computer storage medium |
CN112396005A (en) * | 2020-11-23 | 2021-02-23 | 平安科技(深圳)有限公司 | Biological characteristic image recognition method and device, electronic equipment and readable storage medium |
- 2021-05-18: application CN202110542276.2A filed, granted as CN113516046B (status: Active)
Patent Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111401326A (en) * | 2020-04-21 | 2020-07-10 | 招商局金融科技有限公司 | Target identity recognition method based on picture recognition, server and storage medium |
CN111523479A (en) * | 2020-04-24 | 2020-08-11 | 中国农业科学院农业信息研究所 | Biological feature recognition method and device for animal, computer equipment and storage medium |
Also Published As
Publication number | Publication date |
---|---|
CN113516046A (en) | 2021-10-19 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN113516046B (en) | Method, device, equipment and storage medium for monitoring biological diversity in area | |
CN112613501A (en) | Information auditing classification model construction method and information auditing method | |
CN109816200B (en) | Task pushing method, device, computer equipment and storage medium | |
CN112446310B (en) | Age identification system, method and device based on block chain | |
CN111242948B (en) | Image processing method, image processing device, model training method, model training device, image processing equipment and storage medium | |
Rai et al. | Recognition of Different Bird Category Using Image Processing. | |
CN113034044A (en) | Interviewing method, device, equipment and medium based on artificial intelligence | |
CN111368911B (en) | Image classification method and device and computer readable storage medium | |
CN112949468A (en) | Face recognition method and device, computer equipment and storage medium | |
CN115050064A (en) | Face living body detection method, device, equipment and medium | |
CN111651731A (en) | Method for converting entity product into digital asset and storing same on block chain | |
CN112329629A (en) | Evaluation method and device for online training, computer equipment and storage medium | |
Oraño et al. | Jackfruit fruit damage classification using convolutional neural network | |
CN114707589B (en) | Method, apparatus, storage medium, device and program product for generating challenge sample | |
CN113283388B (en) | Training method, device, equipment and storage medium of living body face detection model | |
CN114492827A (en) | Block chain technology-based federated learning model watermark reinforcement method and application | |
CN111695544B (en) | Information sending method and device based on crowd detection model and computer equipment | |
CN116205726B (en) | Loan risk prediction method and device, electronic equipment and storage medium | |
CN112966787B (en) | Method, device, computer equipment and storage medium for identifying similar patients | |
CN116958846A (en) | Video detection method, device, equipment, medium and product | |
CN112766407B (en) | Image recognition method, device and storage medium | |
CN114863430A (en) | Automatic population information error correction method, device and storage medium thereof | |
CN114462073A (en) | De-identification effect evaluation method and device, storage medium and product | |
CN113821498A (en) | Data screening method, device, equipment and medium | |
CN111291597B (en) | Crowd situation analysis method, device, equipment and system based on image |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||