CN111709374A - Bird condition detection method and device, computer equipment and storage medium


Info

Publication number
CN111709374A
Authority
CN
China
Prior art keywords
bird
condition detection
image
training
model
Prior art date
Legal status
Granted
Application number
CN202010558302.6A
Other languages
Chinese (zh)
Other versions
CN111709374B (en)
Inventor
廖金辉
李德民
吴亦歌
肖娟
贺鹏
Current Assignee
Shanghai Minghang Technology Development Co ltd
Shenzhen Sunwin Intelligent Co Ltd
Original Assignee
Shanghai Minghang Technology Development Co ltd
Shenzhen Sunwin Intelligent Co Ltd
Priority date
Filing date
Publication date
Application filed by Shanghai Minghang Technology Development Co Ltd and Shenzhen Sunwin Intelligent Co Ltd
Priority to CN202010558302.6A
Publication of CN111709374A
Application granted
Publication of CN111709374B
Legal status: Active
Anticipated expiration

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/50 Context or environment of the image
    • G06V20/52 Surveillance or monitoring of activities, e.g. for recognising suspicious objects
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/21 Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F18/214 Generating training patterns; Bootstrap methods, e.g. bagging or boosting
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/04 Architecture, e.g. interconnection topology
    • G06N3/045 Combinations of networks
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/08 Learning methods
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V2201/00 Indexing scheme relating to image or video recognition or understanding
    • G06V2201/07 Target detection

Abstract

The invention relates to a bird condition detection method and apparatus, a computer device, and a storage medium. The method comprises: acquiring a real-time image to obtain an image to be detected; inputting the image to be detected into a bird condition detection model for bird condition detection to obtain a detection result; performing bird condition analysis according to the detection result to obtain an analysis result; and sending a driving notification to bird repelling equipment according to the analysis result, so that the bird repelling equipment performs a bird repelling operation. The bird condition detection model is obtained by training a deep learning neural network using, as sample data, a plurality of bird images carrying bird position labels and grade labels representing the number of birds. The invention can roughly estimate the size of a bird group and accurately detect the current bird condition, thereby improving the repelling effect.

Description

Bird condition detection method and device, computer equipment and storage medium
Technical Field
The present invention relates to bird condition detection, and more particularly to a bird condition detection method, apparatus, computer device, and storage medium.
Background
In some special settings, such as airports, the presence of birds can cause disasters. Such sites are equipped with common basic bird repelling devices, such as ultrasonic, gas gun, and laser devices, but these devices are driven in the traditional way, either on a timer or manually.
The existing solution is to detect the bird condition and then repel birds in a targeted manner. However, current methods that use monitoring equipment count the overall number of birds by detecting single birds and accumulating the counts. As a result, once several birds overlap or occlude one another, they may not be detected, or the detected number of birds is inaccurate, the bird condition is misjudged, and the repelling effect is affected.
Therefore, it is necessary to design a new method that can roughly estimate the size of a bird group and accurately detect the bird condition of the current situation, so as to improve the repelling effect.
Disclosure of Invention
The invention aims to overcome the defects of the prior art and provides a bird condition detection method, a bird condition detection device, computer equipment and a storage medium.
In order to achieve the purpose, the invention adopts the following technical scheme: a bird condition detection method comprising:
acquiring a real-time image to obtain an image to be detected;
inputting the image to be detected into a bird condition detection model for bird condition detection to obtain a detection result;
performing bird condition analysis according to the detection result to obtain an analysis result;
sending a driving notice to the bird repelling equipment according to the analysis result so as to enable the bird repelling equipment to perform bird repelling operation;
the bird condition detection model is obtained by training a deep learning neural network by taking a plurality of bird images with bird position labels and grade labels representing the number of birds as sample data.
The further technical scheme is as follows: the bird condition detection model is obtained by training a deep learning neural network by using a plurality of bird images with bird position tags and grade tags representing the number of birds as sample data sets, and comprises the following steps:
acquiring an image of the bird;
marking a bird position label and a grade label representing the number of birds on a bird image to obtain a sample data set;
and training the deep learning neural network by adopting a sample data set to obtain a bird condition detection model.
The further technical scheme is as follows: the training of the deep learning neural network by adopting the sample data set to obtain the bird condition detection model comprises the following steps:
dividing a sample data set into a training set and a test set;
setting parameters for training the YOLOV4 algorithm;
inputting the training set into a YOLOV4 algorithm to train a network model so as to obtain an initial model;
testing the initial model by adopting a test set to obtain a test result;
judging whether the test result meets the requirement or not;
if the test result does not meet the requirement, executing the parameter for setting the YOLOV4 algorithm training;
and if the test result meets the requirement, taking the initial model as a bird condition detection model.
The further technical scheme is as follows: the detection result comprises the grade corresponding to the number of the bird groups and the position information of the bird groups.
The further technical scheme is as follows: performing bird condition analysis according to the detection result to obtain an analysis result, comprising:
intercepting a real-time image according to the detection result to obtain an intermediate image;
inputting the intermediate image into a bird recognition model for bird recognition to obtain a category;
establishing a corresponding bird condition database according to the categories;
and analyzing the bird ecological environment of the bird condition database to obtain an analysis result.
The further technical scheme is as follows: the bird recognition model is obtained by training a convolutional neural network by using a plurality of images with bird category labels as sample data.
The invention also provides a bird condition detection device, comprising:
the real-time image acquisition unit is used for acquiring a real-time image to obtain an image to be detected;
the detection unit is used for inputting the image to be detected into the bird condition detection model for bird condition detection to obtain a detection result;
the analysis unit is used for carrying out bird condition analysis according to the detection result so as to obtain an analysis result;
and the sending unit is used for sending a driving notice to the bird repelling device according to the analysis result so as to ensure that the bird repelling device carries out bird repelling operation.
The further technical scheme is as follows: the analysis unit includes:
the intercepting subunit is used for intercepting the real-time image according to the detection result so as to obtain an intermediate image;
the identification subunit is used for inputting the intermediate image into a bird identification model for bird identification to obtain a category;
the establishing subunit is used for establishing a corresponding bird condition database according to the categories;
and the environment analysis subunit is used for analyzing the bird ecological environment of the bird condition database to obtain an analysis result.
The invention also provides a computer device, comprising a memory and a processor, wherein a computer program is stored in the memory, and the processor implements the above method when executing the computer program.
The invention also provides a storage medium storing a computer program which, when executed by a processor, is operable to carry out the method as described above.
Compared with the prior art, the invention has the beneficial effects that: the bird condition detection model can detect single birds, multiple birds and overlapped bird groups, estimate the number of the birds in the whole real-time image, analyze the bird conditions according to the obtained detection result, and perform corresponding bird repelling operation according to the bird condition analysis result, so that the purpose of roughly estimating the number of the bird groups is achieved, the bird condition under the current condition is accurately detected, and the driving effect is improved.
The invention is further described below with reference to the accompanying drawings and specific embodiments.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings needed to be used in the description of the embodiments are briefly introduced below, and it is obvious that the drawings in the following description are some embodiments of the present invention, and it is obvious for those skilled in the art to obtain other drawings based on these drawings without creative efforts.
Fig. 1 is a schematic view of an application scenario of a bird condition detection method according to an embodiment of the present invention;
fig. 2 is a schematic flow chart of a bird condition detection method according to an embodiment of the present invention;
fig. 3 is a schematic sub-flow chart of a bird condition detection method according to an embodiment of the present invention;
fig. 4 is a schematic sub-flow chart of a bird condition detection method according to an embodiment of the present invention;
fig. 5 is a schematic sub-flow chart of a bird condition detection method according to an embodiment of the present invention;
fig. 6 is a schematic block diagram of a bird condition detection apparatus according to an embodiment of the present invention;
fig. 7 is a schematic block diagram of an analysis unit of the bird condition detection apparatus according to the embodiment of the present invention;
FIG. 8 is a schematic block diagram of a computer device provided by an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, not all, embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
It will be understood that the terms "comprises" and/or "comprising," when used in this specification and the appended claims, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
It is also to be understood that the terminology used in the description of the invention herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention. As used in the specification of the present invention and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise.
It should be further understood that the term "and/or" as used in this specification and the appended claims refers to and includes any and all possible combinations of one or more of the associated listed items.
Referring to fig. 1 and fig. 2, fig. 1 is a schematic view of an application scenario of the bird condition detection method according to the embodiment of the present invention. Fig. 2 is a schematic flowchart of a bird condition detection method according to an embodiment of the present invention. The bird condition detection method is applied to a server. The server performs data interaction with the camera equipment and the bird repelling equipment, acquires a real-time image from the camera equipment, performs detection by using a trained bird condition detection model, performs analysis according to a detection result, and drives the corresponding bird repelling equipment to perform bird repelling operation according to an analysis result.
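As a minimal illustration of this workflow, the following Python sketch shows how a server-side loop might tie the camera, the detection model, the analysis step, and the repelling devices together. The interfaces (capture_frame, detect, analyze, notify) are assumptions for illustration only; the patent does not prescribe any concrete API.

```python
import time

def run_bird_condition_service(camera, detector, analyze, notify, interval_s=1.0):
    """Illustrative server-side loop: capture, detect, analyze, notify repelling devices.

    camera, detector, analyze and notify are injected dependencies with assumed
    interfaces; they are not defined by the patent.
    """
    while True:
        image = camera.capture_frame()         # acquire a real-time image from the camera equipment
        detections = detector.detect(image)    # bird condition detection with the trained model
        analysis = analyze(image, detections)  # bird condition analysis (species, database, ecology)
        if analysis is not None:
            notify(analysis)                   # send a driving notification to the bird repelling devices
        time.sleep(interval_s)
```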
Fig. 2 is a schematic flow chart of a bird condition detection method according to an embodiment of the present invention. As shown in fig. 2, the method includes the following steps S110 to S140.
And S110, acquiring a real-time image to obtain an image to be detected.
In the present embodiment, the image to be detected refers to an image of a certain area captured by the imaging apparatus.
S120, inputting the image to be detected into the bird condition detection model for bird condition detection to obtain a detection result.
In this embodiment, the detection result refers to the position of a bird group obtained by the bird condition detection model detecting the image to be detected, and the bird-group-number grade corresponding to that position. Specifically, the detection result includes the grade corresponding to the number of the bird group and the position information of the bird group.
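For concreteness, one detected bird group could be represented by a simple data structure such as the sketch below; the field names are assumptions for illustration, since the patent only specifies the grade and the position information.

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class BirdGroupDetection:
    """Illustrative representation of one detected bird group (assumed field names)."""
    grade: str                      # one of: blue, cyan, green, yellow, orange, red
    box: Tuple[int, int, int, int]  # (x_min, y_min, x_max, y_max) in image pixels

# A detection result for one image is then a list of such groups.
DetectionResult = List[BirdGroupDetection]
```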
The bird condition detection model is obtained by training a deep learning neural network by taking a plurality of bird images with bird position labels and grade labels representing the number of birds as sample data.
In an embodiment, referring to fig. 3, the bird condition detection model is obtained by training the deep learning neural network using a plurality of bird images with bird position tags and grade tags indicating the number of birds as sample data, and may include steps S121 to S123.
And S121, acquiring bird images.
In this embodiment, bird images are obtained by collecting photographs of birds taken in different weather conditions and at different times, in various ways such as capture by a camera or collection from the network.
And S122, marking the bird position label and the grade label representing the number of the birds on the bird image to obtain a sample data set.
In this embodiment, the sample data set refers to bird images with labels and data that can be used to train a deep learning neural network.
Bird images are collected from the network and shot manually, covering as many environmental backgrounds as possible; the bird images are then annotated, and the annotations are stored as files in xml data format.
The bird groups in the acquired bird images are marked with rectangular boxes, and the marks are divided into 6 grades, represented by red, orange, yellow, green, cyan and blue. A single bird in a bird image is marked as the blue grade. Multiple birds that occlude or overlap one another, so that single birds cannot be labeled individually, are marked according to their number as 2-10, 11-30, 31-60, 61-100, and more than 100, corresponding to cyan, green, yellow, orange and red respectively; different quantity ranges thus correspond to different grades, and the grades are distinguished by the marking colour. In addition, the position information of the bird group is also labeled to facilitate subsequent bird identification. The labeled bird images are randomly divided into a training set and a test set at a ratio of 9:1; of course, the sample data set may also be divided at a different ratio set according to actual requirements.
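A minimal sketch of the grade mapping and the 9:1 random split described above might look as follows; the helper names are assumptions, and the colour-to-range mapping simply restates the labeling scheme of this embodiment.

```python
import random

# Grade labels as described in this embodiment: colour -> bird-count range.
GRADE_RANGES = {
    "blue":   (1, 1),        # single bird
    "cyan":   (2, 10),
    "green":  (11, 30),
    "yellow": (31, 60),
    "orange": (61, 100),
    "red":    (101, None),   # more than 100 birds (no upper bound)
}

def split_dataset(samples, train_ratio=0.9, seed=42):
    """Randomly split labeled images into training and test sets (default 9:1)."""
    samples = list(samples)
    random.Random(seed).shuffle(samples)
    cut = int(len(samples) * train_ratio)
    return samples[:cut], samples[cut:]
```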
And S123, training the deep learning neural network by adopting the sample data set to obtain a bird condition detection model.
In this embodiment, the bird condition detection model is a trained model that can be used to directly detect bird conditions of the input image to obtain the level corresponding to the number of the bird groups and the positions of the bird groups.
In one embodiment, referring to fig. 4, the step S123 may include steps S1231 to S1236.
And S1231, dividing the sample data set into a training set and a test set.
In the present embodiment, the training set is image data used to train the YOLOV4 algorithm; the test set is image data used to test the trained YOLOV4 algorithm.
S1232, setting parameters for training the YOLOV4 algorithm;
and S1233, inputting the training set into a YOLOV4 algorithm for network model training to obtain an initial model.
In this embodiment, the initial model is trained using the YOLOV4 algorithm of the deep learning neural network with the stochastic gradient descent algorithm. When the loss function value decreases and becomes stable, that is, when the loss value of the loss function tends to be stable, training is stopped and the trained model is saved.
For the labeled training set, 9 groups of pre-selected (anchor) boxes, namely (10, 15), (18, 35), (43, 20), (33, 75), (70, 50), (65, 135), (120, 80), (186, 223) and (333, 260), are designed as initialization parameters of the training model. A model pre-trained on a large open-source visual database is used to initialize the parameters of the YOLOV4 algorithm; the training batch size is set to 128, the learning rate to 0.001, the number of iterations to 100000, and the data rotation angle to 45 degrees.
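Collected into one place, the hyper-parameters and anchor boxes quoted above could be written roughly as the configuration sketch below. How they are actually passed to a concrete YOLOV4 implementation (a darknet .cfg file or a Python wrapper) is not specified by the patent, and the pre-trained weights file name is an assumption.

```python
# Training hyper-parameters quoted in this embodiment, gathered into one dictionary.
# Feeding them to a particular YOLOV4 implementation is left as an assumption.
YOLOV4_TRAIN_CONFIG = {
    "anchors": [(10, 15), (18, 35), (43, 20), (33, 75), (70, 50),
                (65, 135), (120, 80), (186, 223), (333, 260)],
    "pretrained_weights": "open_source_pretrained.weights",  # pre-trained model (assumed file name)
    "batch_size": 128,
    "learning_rate": 0.001,
    "max_iterations": 100000,
    "rotation_angle_deg": 45,   # data augmentation rotation
    "optimizer": "sgd",         # stochastic gradient descent, as described above
}
```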
And S1234, testing the initial model by using a test set to obtain a test result.
In this embodiment, the test result refers to a result obtained by testing the initial model with the test set.
S1235, judging whether the test result meets the requirement;
if the test result does not meet the requirement, executing the step S1232;
and S1236, if the test result meets the requirement, using the initial model as a bird condition detection model.
Specifically, the test result of the initial model on the test set is evaluated using the mAP (mean Average Precision) index. If the mAP is less than 0.95, the training parameter settings are modified or more data is added to the data set, and the model is retrained until the requirement that the mAP is greater than 0.95 is met. The mAP index is obtained by computing the average precision for each class and then averaging over all classes.
The deep learning neural network is trained with the stochastic gradient descent algorithm; when the loss function value has decreased and become stable, the initial model is tested and evaluated, and the optimal model is selected as the bird condition detection model.
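A minimal sketch of this acceptance loop is shown below; train_model, evaluate_map and adjust are hypothetical helpers injected by the caller, not functions defined by the patent.

```python
MAP_THRESHOLD = 0.95

def train_until_acceptable(config, train_set, test_set, train_model, evaluate_map, adjust):
    """Retrain until the model exceeds mAP 0.95 on the test set, as required in this embodiment.

    Assumed helper signatures: train_model(config, train_set) -> model,
    evaluate_map(model, test_set) -> float (mean of per-class average precision),
    adjust(config, train_set) -> (config, train_set) with modified parameters or extra data.
    """
    while True:
        model = train_model(config, train_set)
        map_score = evaluate_map(model, test_set)
        if map_score > MAP_THRESHOLD:
            return model                     # accepted as the bird condition detection model
        config, train_set = adjust(config, train_set)
```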
For example, if 20 blue-level rectangular frames, i.e. 20 frames each containing a single bird, and one green-level rectangular frame, i.e. 11-30 birds, are detected in one picture, the current detection level of the picture is the yellow level, i.e. 31-60 birds, and the output detection result is the yellow level.
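One way to read this example is to sum the per-box count ranges and map the estimated total back onto a grade; the sketch below is an illustrative interpretation, not a formula stated in the patent.

```python
# Same colour-to-range mapping as in the labeling sketch above.
GRADE_RANGES = {
    "blue": (1, 1), "cyan": (2, 10), "green": (11, 30),
    "yellow": (31, 60), "orange": (61, 100), "red": (101, None),
}

def overall_grade(detected_grades):
    """Map the conservative lower-bound estimate of the total bird count onto a grade."""
    low_total = sum(GRADE_RANGES[g][0] for g in detected_grades)
    for grade in ("blue", "cyan", "green", "yellow", "orange", "red"):
        _, hi = GRADE_RANGES[grade]
        if hi is None or low_total <= hi:
            return grade
    return "red"

# Example from the text: 20 single-bird boxes plus one green box (11-30 birds)
# gives an estimated total of at least 31 birds, i.e. the yellow grade.
print(overall_grade(["blue"] * 20 + ["green"]))  # -> "yellow"
```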
And S130, performing bird condition analysis according to the detection result to obtain an analysis result.
In this embodiment, the analysis result refers to a result obtained by performing bird identification and bird ecological environment analysis according to the detection result.
In an embodiment, referring to fig. 5, the step S130 may include steps S131 to S134.
S131, intercepting a real-time image according to the detection result to obtain an intermediate image;
and S132, inputting the intermediate image into a bird recognition model for bird recognition to obtain a category.
In the present embodiment, the category refers to the category of the bird group in the captured real-time image.
The bird identification model is obtained by training a convolutional neural network by using a plurality of images with bird category labels as sample data. Specifically, the training process of the bird recognition model may refer to the training process of the bird condition detection model, which is not described herein again.
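A sketch of the crop-and-classify step is given below. It assumes OpenCV/NumPy-style image arrays, a detection object with a .box attribute as in the earlier sketch, and a classify() method on the recognition model; all of these are assumptions, since the patent does not name a library or API.

```python
import numpy as np

def identify_bird_species(real_time_image: np.ndarray, detection, recognizer):
    """Crop the detected bird-group region and classify it with the bird recognition model."""
    x_min, y_min, x_max, y_max = detection.box                  # position info from the detection result
    intermediate = real_time_image[y_min:y_max, x_min:x_max]    # intercepted intermediate image
    category = recognizer.classify(intermediate)                # hypothetical CNN classifier call
    return category
```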
And S133, establishing a corresponding bird condition database according to the categories.
When a bird condition occurs, the current intermediate image is saved, and bird identification and statistics are performed. Specifically, the picture of the bird's position is cropped and input into the bird recognition model for bird identification; the species of birds appearing in the designated area, the number of occurrences, and the weather are counted, and a bird condition database is established.
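As an illustration, these statistics could be accumulated in a small database; the sketch below uses SQLite purely as an assumed storage backend, and the table and column names are illustrative, not specified by the patent.

```python
import sqlite3
from datetime import datetime

def record_bird_event(db_path, species, area, weather, count=1):
    """Append one bird-condition event (species, area, weather, count, time) to the database."""
    conn = sqlite3.connect(db_path)
    conn.execute(
        "CREATE TABLE IF NOT EXISTS bird_events ("
        "species TEXT, area TEXT, weather TEXT, count INTEGER, observed_at TEXT)"
    )
    conn.execute(
        "INSERT INTO bird_events VALUES (?, ?, ?, ?, ?)",
        (species, area, weather, count, datetime.now().isoformat()),
    )
    conn.commit()
    conn.close()
```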
And S134, analyzing the bird ecological environment of the bird condition database to obtain an analysis result.
The ecological environment of the birds is judged by analyzing the bird condition database, and ecological bird repelling is carried out. For example, whether a bird is an insect-eating bird or a seasonal migratory bird can be analyzed from the species, number, and season of the birds appearing over a period of time, and bird repelling measures can then be taken in a targeted manner based on these data. For birds foraging at an airport, insect extermination can be used for ecological bird repelling, reducing the probability of birds appearing at the airport.
And S140, sending a driving notice to the bird repelling equipment according to the analysis result so as to enable the bird repelling equipment to perform bird repelling operation.
Different bird repelling devices are driven to repel birds according to the bird condition grade; if no bird condition is detected, the bird repelling devices are not driven. For example, if the bird condition is of the green grade, ultrasonic equipment can be driven to repel birds; if it is of the red grade, the ultrasonic equipment can be linked to emit a specific sound wave frequency, achieving a better bird repelling effect.
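The grade-dependent dispatch described above can be sketched as a simple mapping from grade to repelling devices. The device identifiers, the device lists for the intermediate grades, and the notify() helper are assumptions for illustration; the patent only names ultrasonic equipment for the green grade and linked sound-wave emission for the red grade.

```python
# Illustrative mapping from bird-condition grade to repelling devices (assumed lists).
GRADE_TO_DEVICES = {
    "green":  ["ultrasonic"],
    "yellow": ["ultrasonic", "laser"],
    "orange": ["ultrasonic", "laser", "gas_gun"],
    "red":    ["ultrasonic", "laser", "gas_gun", "sound_wave"],  # linked devices, specific sound frequency
}

def dispatch_repellers(grade, notify):
    """Send a driving notification to the devices selected for this grade (none for unlisted grades)."""
    for device in GRADE_TO_DEVICES.get(grade, []):
        notify(device, {"action": "repel", "grade": grade})
```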
Graded data labeling is performed according to the bird condition, so that not only single birds are labeled, and the data is trained at multiple scales. This solves both the problem that a bird cannot be detected because the target is too small and the problem that multiple birds cannot be detected because they overlap and occlude one another. When birds do not overlap, the current number of birds can be counted accurately; when birds overlap in large numbers, the range of the number of birds can be estimated, so the bird condition of the current situation can be output more accurately. In addition, the current bird condition image can be captured and stored, providing a basis for subsequent evidence queries; bird identification is performed, which facilitates research on the ecological environment of birds and supports ecological bird-repelling prevention; and different bird repelling devices are driven according to the bird condition, achieving a better bird repelling effect.
According to the bird condition detection method, a real-time image is acquired and input into the bird condition detection model for bird condition detection. The bird condition detection model can detect a single bird, multiple birds, and overlapping bird groups, and estimate the number of birds in the whole real-time image; the bird condition is then analyzed according to the detection result, and the corresponding bird repelling operation is performed according to the analysis result. The method can therefore roughly estimate the size of a bird group, accurately detect the bird condition of the current situation, and improve the repelling effect.
Fig. 6 is a schematic block diagram of a bird condition detection apparatus 300 according to an embodiment of the present invention. As shown in fig. 6, the present invention also provides a bird condition detection apparatus 300 corresponding to the above bird condition detection method. The bird condition detection apparatus 300 includes units for performing the bird condition detection method described above, and the apparatus may be configured in a server. Specifically, referring to fig. 6, the bird condition detection apparatus 300 includes a real-time image obtaining unit 301, a detection unit 302, an analysis unit 303, and a sending unit 304.
A real-time image obtaining unit 301, configured to obtain a real-time image to obtain an image to be detected; the detection unit 302 is configured to input the image to be detected into the bird condition detection model for bird condition detection to obtain a detection result; an analysis unit 303, configured to perform bird condition analysis according to the detection result to obtain an analysis result; a sending unit 304, configured to send a driving notification to the bird repelling device according to the analysis result, so that the bird repelling device performs bird repelling operation.
In one embodiment, as shown in fig. 7, the analysis unit 303 includes an intercepting subunit 3031, an identifying subunit 3032, a creating subunit 3033, and an environment analysis subunit 3034.
An intercepting subunit 3031, configured to intercept the real-time image according to the detection result to obtain an intermediate image; an identification subunit 3032, configured to input the intermediate image into a bird identification model for bird identification to obtain a category; the establishing subunit 3033 is configured to establish a corresponding bird condition database according to the category; and the environment analysis subunit 3034 is configured to analyze the bird ecological environment of the bird condition database to obtain an analysis result.
The bird recognition model is obtained by training a convolutional neural network by using a plurality of images with bird category labels as sample data.
In an embodiment, the bird condition detection apparatus 300 further includes a detection model construction unit, and the detection model construction unit is configured to train the deep learning neural network by using a plurality of bird images with the bird position tags and the grade tags representing the number of birds as sample data to obtain the bird condition detection model.
In an embodiment, the detection model construction unit includes a bird image acquisition subunit, a label labeling subunit, and a training subunit.
The bird image acquisition subunit is used for acquiring bird images; the tag labeling subunit is used for labeling the bird position tags and the grade tags representing the number of the birds on the bird images to obtain a sample data set; and the training subunit is used for training the deep learning neural network by adopting the sample data set to obtain a bird condition detection model.
In one embodiment, the training subunit includes a dividing module, a parameter setting module, an initial model obtaining module, a testing module, and a result judging module.
The dividing module is used for dividing the sample data set into a training set and a test set; the parameter setting module is used for setting parameters for training the YOLOV4 algorithm; the initial model acquisition module is used for inputting a training set into a Yolov4 algorithm to train a network model so as to obtain an initial model; the test module is used for testing the initial model by adopting a test set to obtain a test result; the result judging module is used for judging whether the test result meets the requirement or not; if the test result does not meet the requirement, executing the parameter for setting the YOLOV4 algorithm training; and if the test result meets the requirement, taking the initial model as a bird condition detection model.
It should be noted that, as can be clearly understood by those skilled in the art, the detailed implementation process of the bird condition detection apparatus 300 and each unit may refer to the corresponding description in the foregoing method embodiment, and for convenience and brevity of description, no further description is provided herein.
The bird condition detection apparatus 300 may be implemented in the form of a computer program that can be run on a computer device as shown in fig. 8.
Referring to fig. 8, fig. 8 is a schematic block diagram of a computer device according to an embodiment of the present application. The computer device 500 is a server, wherein the server may be an independent server or a server cluster composed of a plurality of servers.
Referring to fig. 8, the computer device 500 includes a processor 502, memory, and a network interface 505 connected by a system bus 501, where the memory may include a non-volatile storage medium 503 and an internal memory 504.
The non-volatile storage medium 503 may store an operating system 5031 and a computer program 5032. The computer program 5032 comprises program instructions that, when executed, cause the processor 502 to perform a bird condition detection method.
The processor 502 is used to provide computing and control capabilities to support the operation of the overall computer device 500.
The internal memory 504 provides an environment for the operation of the computer program 5032 in the non-volatile storage medium 503, and when the computer program 5032 is executed by the processor 502, the processor 502 can be enabled to execute a bird condition detection method.
The network interface 505 is used for network communication with other devices. Those skilled in the art will appreciate that the configuration shown in fig. 8 is a block diagram of only a portion of the configuration relevant to the present teachings and does not constitute a limitation on the computer device 500 to which the present teachings may be applied, and that a particular computer device 500 may include more or less components than those shown, or combine certain components, or have a different arrangement of components.
Wherein the processor 502 is configured to run the computer program 5032 stored in the memory to implement the following steps:
acquiring a real-time image to obtain an image to be detected; inputting the image to be detected into a bird condition detection model for bird condition detection to obtain a detection result; performing bird condition analysis according to the detection result to obtain an analysis result; sending a driving notice to the bird repelling equipment according to the analysis result so as to enable the bird repelling equipment to perform bird repelling operation; the bird condition detection model is obtained by training a deep learning neural network by taking a plurality of bird images with bird position labels and grade labels representing the number of birds as sample data.
The detection result comprises the grade corresponding to the number of the bird groups and the position information of the bird groups.
In an embodiment, when implementing the bird condition detection model as a step of training a deep learning neural network by using a plurality of bird images with bird position tags and grade tags representing the number of birds as a sample data set, the processor 502 specifically implements the following steps:
acquiring an image of the bird; marking a bird position label and a grade label representing the number of birds on a bird image to obtain a sample data set; and training the deep learning neural network by adopting a sample data set to obtain a bird condition detection model.
In an embodiment, when the processor 502 implements the step of training the deep learning neural network by using the sample data set to obtain the bird condition detection model, the following steps are specifically implemented:
dividing a sample data set into a training set and a test set; setting parameters for training the YOLOV4 algorithm; inputting the training set into a YOLOV4 algorithm to train a network model so as to obtain an initial model; testing the initial model by adopting a test set to obtain a test result; judging whether the test result meets the requirement or not; if the test result does not meet the requirement, executing the parameter for setting the YOLOV4 algorithm training; and if the test result meets the requirement, taking the initial model as a bird condition detection model.
In an embodiment, when the processor 502 implements the step of performing bird condition analysis according to the detection result to obtain an analysis result, the following steps are specifically implemented:
intercepting a real-time image according to the detection result to obtain an intermediate image; inputting the intermediate image into a bird recognition model for bird recognition to obtain a category; establishing a corresponding bird condition database according to the categories; and analyzing the bird ecological environment of the bird condition database to obtain an analysis result.
The bird identification model is obtained by training a convolutional neural network by using a plurality of images with bird category labels as sample data.
It should be understood that, in the embodiment of the present Application, the Processor 502 may be a Central Processing Unit (CPU), and the Processor 502 may also be other general-purpose processors, Digital Signal Processors (DSPs), Application Specific Integrated Circuits (ASICs), Field-Programmable Gate arrays (FPGAs) or other Programmable logic devices, discrete Gate or transistor logic devices, discrete hardware components, and the like. Wherein a general purpose processor may be a microprocessor or the processor may be any conventional processor or the like.
It will be understood by those skilled in the art that all or part of the flow of the method implementing the above embodiments may be implemented by a computer program instructing associated hardware. The computer program includes program instructions, and the computer program may be stored in a storage medium, which is a computer-readable storage medium. The program instructions are executed by at least one processor in the computer system to implement the flow steps of the embodiments of the method described above.
Accordingly, the present invention also provides a storage medium. The storage medium may be a computer-readable storage medium. The storage medium stores a computer program, wherein the computer program, when executed by a processor, causes the processor to perform the steps of:
acquiring a real-time image to obtain an image to be detected; inputting the image to be detected into a bird condition detection model for bird condition detection to obtain a detection result; performing bird condition analysis according to the detection result to obtain an analysis result; sending a driving notice to the bird repelling equipment according to the analysis result so as to enable the bird repelling equipment to perform bird repelling operation; the bird condition detection model is obtained by training a deep learning neural network by taking a plurality of bird images with bird position labels and grade labels representing the number of birds as sample data.
The detection result comprises the grade corresponding to the number of the bird groups and the position information of the bird groups.
In an embodiment, when the processor executes the computer program to implement the step of training the deep learning neural network by using a plurality of bird images with bird position tags and grade tags representing the number of birds as sample data sets, the processor specifically implements the following steps:
acquiring an image of the bird; marking a bird position label and a grade label representing the number of birds on a bird image to obtain a sample data set; and training the deep learning neural network by adopting a sample data set to obtain a bird condition detection model.
In an embodiment, when the processor executes the computer program to implement the step of training the deep learning neural network by using the sample data set to obtain the bird condition detection model, the following steps are specifically implemented:
dividing a sample data set into a training set and a test set; setting parameters for training the YOLOV4 algorithm; inputting the training set into a YOLOV4 algorithm to train a network model so as to obtain an initial model; testing the initial model by adopting a test set to obtain a test result; judging whether the test result meets the requirement or not; if the test result does not meet the requirement, executing the parameter for setting the YOLOV4 algorithm training; and if the test result meets the requirement, taking the initial model as a bird condition detection model.
In an embodiment, when the processor executes the computer program to implement the step of performing bird condition analysis according to the detection result to obtain an analysis result, the following steps are specifically implemented:
intercepting a real-time image according to the detection result to obtain an intermediate image; inputting the intermediate image into a bird recognition model for bird recognition to obtain a category; establishing a corresponding bird condition database according to the categories; and analyzing the bird ecological environment of the bird condition database to obtain an analysis result.
The bird recognition model is obtained by training a convolutional neural network by using a plurality of images with bird category labels as sample data.
The storage medium may be a USB disk, a removable hard disk, a Read-Only Memory (ROM), a magnetic disk, an optical disk, or any other computer-readable storage medium capable of storing a computer program.
Those of ordinary skill in the art will appreciate that the units and algorithm steps of the examples described in connection with the embodiments disclosed herein can be implemented in electronic hardware, computer software, or a combination of the two. To clearly illustrate the interchangeability of hardware and software, the components and steps of the examples have been described above generally in terms of their functionality. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present invention.
In the embodiments provided in the present invention, it should be understood that the disclosed apparatus and method may be implemented in other ways. For example, the above-described apparatus embodiments are merely illustrative. For example, the division of each unit is only one logic function division, and there may be another division manner in actual implementation. For example, various elements or components may be combined or may be integrated into another system, or some features may be omitted, or not implemented.
The steps in the method of the embodiment of the invention can be sequentially adjusted, combined and deleted according to actual needs. The units in the device of the embodiment of the invention can be merged, divided and deleted according to actual needs. In addition, functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a storage medium. Based on such understanding, the technical solution of the present invention essentially or partially contributes to the prior art, or all or part of the technical solution can be embodied in the form of a software product, which is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a terminal, or a network device) to execute all or part of the steps of the method according to the embodiments of the present invention.
While the invention has been described with reference to specific embodiments, the invention is not limited thereto, and various equivalent modifications and substitutions can be easily made by those skilled in the art within the technical scope of the invention. Therefore, the protection scope of the present invention shall be subject to the protection scope of the claims.

Claims (10)

1. A bird condition detection method is characterized by comprising the following steps:
acquiring a real-time image to obtain an image to be detected;
inputting the image to be detected into a bird condition detection model for bird condition detection to obtain a detection result;
performing bird condition analysis according to the detection result to obtain an analysis result;
sending a driving notice to the bird repelling equipment according to the analysis result so as to enable the bird repelling equipment to perform bird repelling operation;
the bird condition detection model is obtained by training a deep learning neural network by taking a plurality of bird images with bird position labels and grade labels representing the number of birds as sample data.
2. The bird condition detection method according to claim 1, wherein the bird condition detection model is obtained by training a deep learning neural network using a plurality of bird images with a bird position tag and a rank tag indicating the number of birds as sample data sets, and includes:
acquiring an image of the bird;
marking a bird position label and a grade label representing the number of birds on a bird image to obtain a sample data set;
and training the deep learning neural network by adopting a sample data set to obtain a bird condition detection model.
3. The method according to claim 2, wherein training the deep learning neural network with the sample data set to obtain a bird detection model comprises:
dividing a sample data set into a training set and a test set;
setting parameters for training the YOLOV4 algorithm;
inputting the training set into a YOLOV4 algorithm to train a network model so as to obtain an initial model;
testing the initial model by adopting a test set to obtain a test result;
judging whether the test result meets the requirement or not;
if the test result does not meet the requirement, executing the parameter for setting the YOLOV4 algorithm training;
and if the test result meets the requirement, taking the initial model as a bird condition detection model.
4. The bird condition detection method according to claim 1, wherein the detection result includes a rank corresponding to the number of the bird group and position information of the bird group.
5. The bird condition detection method according to claim 4, wherein performing bird condition analysis based on the detection result to obtain an analysis result comprises:
intercepting a real-time image according to the detection result to obtain an intermediate image;
inputting the intermediate image into a bird recognition model for bird recognition to obtain a category;
establishing a corresponding bird condition database according to the categories;
and analyzing the bird ecological environment of the bird condition database to obtain an analysis result.
6. The bird condition detection method according to claim 5, wherein the bird recognition model is obtained by training a convolutional neural network using a plurality of images with bird category labels as sample data.
7. Bird condition detection device, its characterized in that includes:
the real-time image acquisition unit is used for acquiring a real-time image to obtain an image to be detected;
the detection unit is used for inputting the image to be detected into the bird condition detection model for bird condition detection to obtain a detection result;
the analysis unit is used for carrying out bird condition analysis according to the detection result so as to obtain an analysis result;
and the sending unit is used for sending a driving notice to the bird repelling device according to the analysis result so as to ensure that the bird repelling device carries out bird repelling operation.
8. The bird situation detection apparatus according to claim 7, wherein the analysis unit includes:
the intercepting subunit is used for intercepting the real-time image according to the detection result so as to obtain an intermediate image;
the identification subunit is used for inputting the intermediate image into a bird identification model for bird identification to obtain a category;
the establishing subunit is used for establishing a corresponding bird condition database according to the categories;
and the environment analysis subunit is used for analyzing the bird ecological environment of the bird condition database to obtain an analysis result.
9. A computer device, characterized in that the computer device comprises a memory, on which a computer program is stored, and a processor, which when executing the computer program implements the method according to any of claims 1 to 6.
10. A storage medium, characterized in that the storage medium stores a computer program which, when executed by a processor, implements the method according to any one of claims 1 to 6.
CN202010558302.6A 2020-06-18 2020-06-18 Bird condition detection method, bird condition detection device, computer equipment and storage medium Active CN111709374B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010558302.6A CN111709374B (en) 2020-06-18 2020-06-18 Bird condition detection method, bird condition detection device, computer equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010558302.6A CN111709374B (en) 2020-06-18 2020-06-18 Bird condition detection method, bird condition detection device, computer equipment and storage medium

Publications (2)

Publication Number Publication Date
CN111709374A 2020-09-25
CN111709374B 2023-06-27

Family

ID=72540850

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010558302.6A Active CN111709374B (en) 2020-06-18 2020-06-18 Bird condition detection method, bird condition detection device, computer equipment and storage medium

Country Status (1)

Country Link
CN (1) CN111709374B (en)


Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108710126A (en) * 2018-03-14 2018-10-26 上海鹰觉科技有限公司 Automation detection expulsion goal approach and its system
CN109033975A (en) * 2018-06-27 2018-12-18 山东大学 Birds detection, identification and method for tracing and device in a kind of monitoring of seashore
CN110210577A (en) * 2019-06-17 2019-09-06 重庆英卡电子有限公司 A kind of deep learning and recognition methods for intensive flock of birds
CN110969107A (en) * 2019-11-25 2020-04-07 上海交通大学 Bird population identification analysis method and system based on network model
CN110889841A (en) * 2019-11-28 2020-03-17 江苏电力信息技术有限公司 YOLOv 3-based bird detection algorithm for power transmission line

Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112184322A (en) * 2020-10-13 2021-01-05 中国农业银行股份有限公司四川省分行 Live livestock mortgage loan pre-loan valuation method based on picture recognition
CN112560675A (en) * 2020-12-15 2021-03-26 三峡大学 Bird visual target detection method combining YOLO and rotation-fusion strategy
CN112560675B (en) * 2020-12-15 2022-06-21 三峡大学 Bird visual target detection method combining YOLO and rotation-fusion strategy
CN112633375A (en) * 2020-12-23 2021-04-09 深圳市赛为智能股份有限公司 Bird detection method and device, computer equipment and storage medium
CN112668444A (en) * 2020-12-24 2021-04-16 南京泓图人工智能技术研究院有限公司 Bird detection and identification method based on YOLOv5
CN113076860A (en) * 2021-03-30 2021-07-06 南京大学环境规划设计研究院集团股份公司 Bird detection system under field scene
CN113076860B (en) * 2021-03-30 2022-02-25 南京大学环境规划设计研究院集团股份公司 Bird detection system under field scene
CN113255691A (en) * 2021-04-15 2021-08-13 南昌大学 Method for detecting and identifying harmful bird species target of bird-involved fault of power transmission line
CN113469014A (en) * 2021-06-29 2021-10-01 智洋创新科技股份有限公司 Deep learning-based bird hidden danger prevention and control method for power transmission line
CN114387510A (en) * 2021-12-22 2022-04-22 广东工业大学 Bird identification method and device for power transmission line and storage medium

Also Published As

Publication number Publication date
CN111709374B (en) 2023-06-27

Similar Documents

Publication Publication Date Title
CN111709374B (en) Bird condition detection method, bird condition detection device, computer equipment and storage medium
Aquino et al. vitisBerry: An Android-smartphone application to early evaluate the number of grapevine berries by means of image analysis
CN110046631B (en) System and method for automatically inferring changes in spatiotemporal images
Aquino et al. A new methodology for estimating the grapevine-berry number per cluster using image analysis
US10058076B2 (en) Method of monitoring infectious disease, system using the same, and recording medium for performing the same
CN111709421B (en) Bird identification method, bird identification device, computer equipment and storage medium
CN111709372B (en) Bird repelling method and device, computer equipment and storage medium
WO2019214309A1 (en) Model test method and device
CN108615046A (en) A kind of stored-grain pests detection recognition methods and device
CN109492665A (en) Detection method, device and the electronic equipment of growth period duration of rice
Alharbi et al. Automatic counting of wheat spikes from wheat growth images
JP2018512567A (en) Barcode tag detection in side view sample tube images for laboratory automation
CN112069985A (en) High-resolution field image rice ear detection and counting method based on deep learning
CN115482465A (en) Crop disease and insect pest prediction method and system based on machine vision and storage medium
Ye et al. An image-based approach for automatic detecting tasseling stage of maize using spatio-temporal saliency
CN111325181B (en) State monitoring method and device, electronic equipment and storage medium
US9953238B2 (en) Image processing method and system for extracting distorted circular image elements
CN113869327A (en) Data processing method and system based on soil element content detection
CN108229467B (en) Method and device for interpreting remote sensing image and electronic equipment
Pintor et al. Govocitos: A software tool for estimating fish fecundity based on digital analysis of histological images
CN112633375A (en) Bird detection method and device, computer equipment and storage medium
Bereciartua-Pérez et al. Multiclass insect counting through deep learning-based density maps estimation
CN113837242A (en) Method and system for detecting content of soil elements
CN114169404A (en) Method for intelligently acquiring quantitative information of slope diseases based on images
CN112308061A (en) License plate character sorting method, recognition method and device

Legal Events

Code Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant