CN110210577B - Deep learning and identification method for dense bird group - Google Patents

Deep learning and identification method for dense bird group

Info

Publication number
CN110210577B
CN110210577B (application CN201910522144.6A)
Authority
CN
China
Prior art keywords
bird
bird group
probability density
neural network
function
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201910522144.6A
Other languages
Chinese (zh)
Other versions
CN110210577A (en
Inventor
唐灿
江朝元
曹晓莉
封强
柳荣星
孙雨桐
刘崇科
马吉刚
彭鹏
李靖
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Chongqing Intercontrol Electronics Co ltd
Original Assignee
Chongqing Intercontrol Electronics Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Chongqing Intercontrol Electronics Co ltd filed Critical Chongqing Intercontrol Electronics Co ltd
Priority to CN201910522144.6A priority Critical patent/CN110210577B/en
Publication of CN110210577A publication Critical patent/CN110210577A/en
Application granted granted Critical
Publication of CN110210577B publication Critical patent/CN110210577B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/21Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F18/214Generating training patterns; Bootstrap methods, e.g. bagging or boosting
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/10Terrestrial scenes

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • General Physics & Mathematics (AREA)
  • Artificial Intelligence (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • General Engineering & Computer Science (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Multimedia (AREA)
  • Image Analysis (AREA)

Abstract

The invention discloses a deep learning and identification method for dense bird groups, comprising a probability density map generation flow and a training flow for a full convolution neural network. The probability density map generation flow comprises: inputting bird group photos into a bird group photo set; formulating a color table; dot-marking every bird in each bird group photo; converting the marked photo into a continuous density function using Gaussian convolution; and mapping the continuous density function against the color table by table look-up to obtain the corresponding probability density map A. The training flow of the full convolution neural network comprises: augmenting the bird group photos to obtain bird group images; establishing an FCNN full convolution neural network; obtaining the corresponding loss function; inputting the bird group images into the neural network to obtain a probability density map B; and calculating the value of the loss function to obtain the corresponding weight values. The notable effect of the method is that image training with a probability-based deep learning technique yields an estimate of the number of birds.

Description

Deep learning and identification method for dense bird group
Technical Field
The invention relates to the technical field of bird identification, in particular to a deep learning and identification method for a dense bird group.
Background
In many wild bird nature reserves in China, flocks of birds gather every year to overwinter, breed and inhabit. They are often densely packed, crowding together to rest, forage, call and fly, sometimes covering the sky. Tracking and counting the birds has become an issue that nature reserves cannot avoid.
The deficiency of the prior art is that traditional statistical methods are essentially unable to identify and count dense bird flocks, and people often attempt to count from photographs. Because the photos are numerous and the birds in them extremely dense, such counting proves essentially unworkable. In recent years, deep learning algorithms have been applied to bird identification, mostly counting the birds in a photograph by segmentation. However, when the flocks are too dense, mutual occlusion prevents correct identification; even manual segmentation is very difficult, and existing machine segmentation essentially cannot achieve it.
Disclosure of Invention
In view of at least one of the defects of the prior art, the invention aims to provide a deep learning and identification method for dense bird groups, which uses a deep learning technology based on probability to perform image training to obtain a corresponding probability density map, thereby estimating the number of birds and solving the problems of identification and counting of the dense bird groups.
In order to achieve the purpose, the invention adopts the following technical scheme: the deep learning method for the dense bird group is characterized by comprising a probability density map generation process of a bird group photo and a training process of a full convolution neural network;
the generation flow of the probability density chart of the bird group photo comprises the following steps:
step A1: establishing a bird group photo set, and inputting the bird group photo into the bird group photo set;
step A2: formulating a color table, wherein the color table is provided with corresponding bird density numerical values;
step A3: sequentially taking out the pictures of the bird group from the picture set of the bird group;
step A4: marking all the birds in the photos of the bird group by using a marking tool;
step A5: converting the marked images of the bird population into a continuous density function using an adaptive gaussian convolution;
step A6: performing table look-up mapping on the numerical value of the continuous density function and the bird density numerical value of the color table to obtain a corresponding visual probability density map A;
step A7: and judging whether the shot of the unmarked bird group still exists, and if the shot of the unmarked bird group still exists, turning to the step A3, and ending the process.
The training process of the full convolution neural network comprises the following steps:
step B1: acquiring a marked bird group photo set;
step B2: performing image augmentation processing on the bird group photos in the bird group photo set to obtain augmented bird group images;
step B3: establishing a self-defined FCNN full convolution neural network;
step B4: obtaining a loss function corresponding to the FCNN full convolution neural network;
step B5: determining a termination loss function threshold value and an iteration number threshold value of a loss function;
step B6: sequentially taking out bird group images from the bird group photo set, and inputting the bird group images into an FCNN full convolution neural network to obtain a probability density graph B;
step B7: calculating a function value of a corresponding loss function according to the probability density graph B and the corresponding probability density graph A, and training an FCNN full convolution neural network to obtain a corresponding weight value;
step B8: judging whether the function value of the loss function is smaller than the termination loss function threshold; if so, turning to step B10, and if not, turning to step B9;
step B9: judging whether the iteration times of the loss function is greater than an iteration time threshold value or not; if yes, go to step B10, otherwise go to step B6;
step B10: the weight value is saved as the final weight value.
Through the probability density map generation flow, the densely packed birds are labeled in probability density map form, turning each bird group photo into a probability density map.
Through the training flow of the full convolution neural network, deep recognition is performed on the basis of the probability density map: the density map is generated with a deep neural network algorithm, and the result produced by the network is likewise a density map, yielding the final weight values of the FCNN full convolution neural network.
The key point of the identification method for the dense bird group is that,
the method comprises the following steps:
step C1: acquiring a new bird group photo;
step C2: processing the new bird group photo by using the FCNN full convolution neural network and the final weight value to generate a corresponding probability density graph F;
step C3: the probability density map F is counted.
The key point of the identification method for the dense bird group is that the step C3 counts the probability density map F by the following formula (6);
C(F) = \int_H f(x)\,dx    (6)
wherein F represents the probability density map of the new bird group photo, f(x) represents the probability density at any point x in the map F, and the integral is taken over the whole map; the value of C(F) is finally output as the counting result.
Through this arrangement of the identification method, the density map is converted into a count by integrating it; the method uses a custom special convolutional neural network to achieve a faster effect, and the bird identification and counting results are given at one time.
The method has the notable advantage that it performs image training using a probability-based deep learning technique to obtain the corresponding probability density map, thereby estimating the number of birds and solving the problems of identifying and counting dense bird groups.
Drawings
FIG. 1 is a flow chart of a process for generating a probability density map of a photograph of a group of birds;
FIG. 2 is a flow chart of a training process for a full convolutional neural network;
FIG. 3 is a flow chart of a method of identifying a dense cluster of birds;
FIG. 4 is a schematic diagram of an embodiment of the present method;
FIG. 5 is a schematic diagram of a color table;
FIG. 6 is a schematic view of a horizontal flip of an avian photograph;
fig. 7 is a schematic diagram of cropping of an avian photograph.
Detailed Description
The invention is described in further detail below with reference to the figures and specific examples.
As shown in fig. 1-7, a deep learning and identification method for dense bird groups includes a probability density map generation process of a bird group photo and a training process of a full convolution neural network;
the generation flow of the probability density chart of the bird group photo comprises the following steps:
step A1: establishing a bird group photo set, and inputting the bird group photo into the bird group photo set;
step A2: formulating a color table, wherein the color table is provided with corresponding bird density numerical values;
step A3: sequentially taking out the pictures of the bird group from the picture set of the bird group;
step A4: marking all the birds in the photos of the bird group by using a marking tool;
inputting a bird group photo, manually marking the central part of the body trunk of all birds or selecting an approximate center which can be marked due to shielding by using an image marking tool to perform dotting marking, wherein each bird has one point, and providing various corresponding labels of the birds; JSON examples of its data are as follows:
[ { "white swan", 100,201}, { "red mouth gull", 122,231}, … ]
Pictures of bird groups come from the network and pictures of the bird group taken in the natural reserve area;
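The dot-annotation format above can be sketched in code. A minimal sketch (the `{ "white swan", 100,201}` notation in the example is not strict JSON, so a list form is assumed here, and the function names are hypothetical) parses the annotations into mark-point coordinates and builds a map with one unit impulse per marked bird:

```python
import json
import numpy as np

def annotations_to_points(json_text):
    """Parse dot annotations like [["white swan", 100, 201], ...] into an (N, 2) array.

    Each entry carries a species label plus the (x, y) pixel of the marked
    bird center; one point per bird (list form assumed for valid JSON).
    """
    entries = json.loads(json_text)
    return np.array([[e[1], e[2]] for e in entries], dtype=float)

def points_to_impulse_map(points, height, width):
    """Build a map with a unit impulse at each mark point (one bird = mass 1)."""
    h = np.zeros((height, width), dtype=float)
    for x, y in points.astype(int):
        if 0 <= y < height and 0 <= x < width:
            h[y, x] += 1.0  # each bird contributes an impulse integrating to 1
    return h

demo = '[["white swan", 100, 201], ["red mouth gull", 122, 231]]'
pts = annotations_to_points(demo)
H = points_to_impulse_map(pts, 480, 640)
print(int(H.sum()))  # total impulse mass equals the number of marked birds: 2
```

The impulse map is the discrete counterpart of the image function H(x) defined in step A51 below, which the adaptive Gaussian convolution of step A52 then smooths.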
step A5: converting the marked images of the bird population into a continuous density function using an adaptive gaussian convolution;
step A6: performing table look-up mapping on the numerical value of the continuous density function and the bird density numerical value of the color table to obtain a corresponding visual probability density map A;
step A7: and judging whether the shot of the unmarked bird group still exists, and if the shot of the unmarked bird group still exists, turning to the step A3, and ending the process.
The training process of the full convolution neural network comprises the following steps:
step B1: acquiring a marked bird group photo set;
step B2: performing image augmentation processing on the bird group photos in the bird group photo set to obtain augmented bird group images;
step B3: establishing a self-defined FCNN full convolution neural network;
step B4: obtaining a loss function corresponding to the FCNN full convolution neural network;
step B5: determining a termination loss function threshold value and an iteration number threshold value of a loss function;
step B6: sequentially taking out bird group images from the bird group photo set, and inputting the bird group images into an FCNN full convolution neural network to obtain a probability density graph B;
step B7: calculating a function value of a corresponding loss function according to the probability density graph B and the corresponding probability density graph A, and training an FCNN full convolution neural network to obtain a corresponding weight value;
step B8: judging whether the function value of the loss function is smaller than the termination loss function threshold; if so, turning to step B10, and if not, turning to step B9;
step B9: judging whether the iteration times of the loss function is greater than an iteration time threshold value or not; if yes, go to step B10, otherwise go to step B6;
step B10: the weight value is saved as the final weight value.
Through the probability density map generation flow, the densely packed birds are labeled in probability density map form, turning each bird group photo into a probability density map.
Through the training flow of the full convolution neural network, deep recognition is performed on the basis of the probability density map: the density map is generated with a deep neural network algorithm, and the result produced by the network is likewise a density map.
The deep learning method for the dense bird group is characterized in that,
and B, changing the color of the color chart formulated in the step A2 from light to dark, and correspondingly setting bird density numerical values from low to high.
The darker the color is, the larger the bird density value is;
the deep learning method for the dense bird group is characterized in that,
the step A5 includes:
step A51: representing the marked images of the bird group by using a bird group image function H (x);
Let x_i denote the center coordinate of the mark point of a bird in the bird group image; a bird group image carrying N bird marks can then be represented by the bird group image function H(x):
H(x) = \sum_{i=1}^{N} \delta(x - x_i)    (1)
in the formula (1), x represents an arbitrary image point in the bird group photograph, and \delta denotes the unit impulse function centered at the mark point coordinate x_i; the integral over each mark point is 1. Simply put, each mark point is discretized into a probability function centered on it: the closer to the center, the more likely the point is a bird. With this function a generic bird image can already be represented, but perspective has not yet been considered.
Step A52: converting a bird population image function h (x) to a continuous density function using adaptive gaussian convolution;
the probability density of an image can be serialized by performing a gaussian convolution on the image. Due to the perspective property of the photographed image, it exhibits a near-far nature. Then the image can be convolution corrected using adaptive gaussian convolution;
the formula of the adaptive gaussian convolution is:
F(x) = \sum_{i=1}^{N} \delta(x - x_i) * G_{\sigma_i}(x)    (2)
wherein in the formula (2), F(x) is the continuous density function and \sigma_i is the standard deviation of the Gaussian kernel:
\sigma_i = \beta \bar{d}_i    (3)
in the formula (3), \beta is a hyper-parameter; for the bird body, the value of \beta is 0.37. \bar{d}_i represents the average of the Euclidean distances from the mark point center coordinate x_i to its k adjacent mark points:
\bar{d}_i = \frac{1}{k} \sum_{j=1}^{k} d_j^i    (4)
wherein k in formula (4) represents the number of mark points adjacent to the center coordinate x_i, and d_j^i represents the Euclidean distance from x_i to the j-th of its k adjacent mark points. When the number of birds in the photo is too small, \bar{d}_i becomes too large, so \bar{d}_i is capped at 24, i.e. only birds within 576 pixels (a 24×24 neighborhood) are scanned.
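Formulas (2)-(4) can be sketched as follows; a minimal implementation assuming SciPy is available (function and parameter names are hypothetical) that smooths each mark point with a Gaussian whose standard deviation is σ_i = β·d̄_i, capping d̄_i at 24 pixels as described:

```python
import numpy as np
from scipy.ndimage import gaussian_filter
from scipy.spatial import cKDTree

def adaptive_density_map(points, height, width, beta=0.37, k=3, d_cap=24.0):
    """Continuous density map F(x) per formulas (2)-(4).

    Each mark point is smoothed by a Gaussian kernel whose standard deviation
    is sigma_i = beta * d_bar_i (formula (3)), where d_bar_i is the mean
    Euclidean distance to the k nearest mark points (formula (4)), capped at
    d_cap = 24 pixels for sparse photos as described in the text.
    """
    density = np.zeros((height, width), dtype=float)
    if len(points) == 0:
        return density
    tree = cKDTree(points)
    for i, (x, y) in enumerate(points):
        if len(points) > k:
            # the query returns the point itself first, so ask for k + 1
            dists, _ = tree.query(points[i], k=k + 1)
            d_bar = min(dists[1:].mean(), d_cap)
        else:
            d_bar = d_cap
        sigma = beta * d_bar
        impulse = np.zeros((height, width), dtype=float)
        impulse[int(round(y)), int(round(x))] = 1.0
        # formula (2): convolve the unit impulse with G_sigma; 'reflect'
        # boundaries (the default) keep each bird's total mass equal to 1
        density += gaussian_filter(impulse, sigma)
    return density
```

Because every bird keeps unit mass after smoothing, the sum of the map equals the number of marked birds, which is what makes the later count-by-integration step work.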
In step A6, table look-up mapping is performed between the values of the continuous density function F(x) and the bird density values of the color table, obtaining the corresponding visualized probability density map A, represented in color.
The key point of the deep learning method for the dense bird group is that the step B2 includes,
step B21: horizontally turning the bird group photos in the bird group photo set;
the pictures of the bird group are interchanged from side to side by taking the vertical central line as the center. The purpose of this is to: the photo album is enlarged.
Step B22: and (3) cutting each horizontally turned bird group photo for 4 times to obtain 4 image sub-blocks, wherein each image sub-block is 1/4 of the original bird group photo, removing the original bird group photo, and adding the cut image sub-blocks serving as bird group images into a bird group photo set.
This process is called data enhancement; after the original bird group photo is removed, the resulting bird group images are added to the bird group photo set. The purpose of this step is to overcome the shortage of images in the image library.
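Steps B21-B22 can be sketched as follows (a minimal sketch; the text does not say whether the un-flipped quarters are also kept, so only the flipped photo is cropped here):

```python
import numpy as np

def augment(photo):
    """Data enhancement from steps B21-B22: horizontally flip the photo about
    its vertical center line, then cut the flipped photo into 4 sub-blocks,
    each 1/4 of the original; the sub-blocks replace the original photo."""
    flipped = photo[:, ::-1]            # B21: left-right interchange
    h2, w2 = photo.shape[0] // 2, photo.shape[1] // 2
    return [flipped[:h2, :w2], flipped[:h2, w2:],   # B22: four quarter crops
            flipped[h2:, :w2], flipped[h2:, w2:]]

img = np.arange(64).reshape(8, 8)
subs = augment(img)
print(len(subs), subs[0].shape)  # 4 sub-blocks, each 4x4
```

In a real pipeline the same flip and crops would also be applied to the ground-truth density maps so that images and targets stay aligned.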
The key point of the deep learning method for the dense bird group is that the network structure table of the FCNN full convolution neural network in the step B3 is
[Network structure table: a ResNet18 backbone followed by a custom 7-layer head of conv3 (3×3) hole convolutions, with channel counts 64, 128, 256 and 512.]
Wherein conv3 denotes the use of a 3×3 convolution, and step2 denotes a hole convolution with step size 2; conv3-512 indicates that the number of convolution kernels is 512, yielding 512 output channels; 64, 128 and 256 likewise denote kernel counts.
The front part of the network uses the ResNet18 network as a backbone to extract features. The later stage appends a custom 7-layer convolution, further enlarging the receptive field and strengthening the ability to extract features.
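The effect of the hole (dilated) convolutions on the receptive field can be illustrated with a small calculation. Treating "step 2" as a dilation rate of 2 (an assumption, since the text is ambiguous between stride and dilation), this sketch compares the receptive-field growth of the described 7-layer head with and without dilation:

```python
def receptive_field(layers):
    """Receptive field of a stack of convolutions.

    Each layer is (kernel, stride, dilation); the receptive field grows by
    (kernel - 1) * dilation * jump, where jump is the product of the strides
    of all earlier layers.
    """
    rf, jump = 1, 1
    for kernel, stride, dilation in layers:
        rf += (kernel - 1) * dilation * jump
        jump *= stride
    return rf

# 7 custom 3x3 hole convolutions with dilation 2, stride 1 (the head of B3)
head = [(3, 1, 2)] * 7
plain = [(3, 1, 1)] * 7
print(receptive_field(head), receptive_field(plain))  # 29 vs 15 pixels
```

Dilation doubles the per-layer growth at no extra parameter cost, which is why the head enlarges the receptive field without shrinking the density map.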
The deep learning method for the dense bird group is characterized in that the corresponding loss function of the FCNN full convolution neural network is as follows:
L(\Theta) = \frac{1}{2N} \sum_{i=1}^{N} \| X_i - F_i \|_2^2    (5)
wherein in the formula (5), N represents the number of marked points in the bird group photo, X_i represents the bird density map obtained after the bird group image is processed by the FCNN full convolution neural network with parameters \Theta, and F_i represents the continuous density function of the bird group image, i.e. the probability density map obtained in the probability density map generation flow. L(\Theta) is the loss function: the square of the L2 norm between the generated probability density points and the corresponding points of the probability density map. If the generated probability density map coincides with the probability density map generated in step one, the loss function is 0.
By using the FCNN and this loss function, the input enhanced images and their corresponding probability density maps can be trained with back-propagation, continually approximating the target to obtain the corresponding weight values. The initial learning rate is 0.01; it is reduced tenfold every 20 cycles and is not reduced below 0.0001, so that learning is faster and more accurate.
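A minimal sketch of the loss of formula (5) and of the stated learning-rate schedule (function names hypothetical; the framework-specific training loop is omitted):

```python
import numpy as np

def density_loss(pred_maps, gt_maps):
    """Loss of formula (5): over N samples, half the mean squared L2 distance
    between the generated map X_i and the ground-truth map F_i."""
    n = len(pred_maps)
    return sum(np.sum((x - f) ** 2) for x, f in zip(pred_maps, gt_maps)) / (2 * n)

def learning_rate(epoch, base=0.01, drop_every=20, floor=1e-4):
    """Schedule described in the text: start at 0.01, divide by ten every
    20 cycles, never dropping below 0.0001."""
    lr = base / (10 ** (epoch // drop_every))
    return max(lr, floor)

print(learning_rate(0), learning_rate(20), learning_rate(100))
```

The loss is zero exactly when the generated map coincides with the stage-one map, matching the termination criterion of steps B5 and B8.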
The deep learning identification method for the dense bird group is characterized in that the termination loss function threshold is 0.01; the threshold number of iterations is 500.
Enhanced images are continually taken from the bird group photo set for iteration; if the function value of the loss function is less than 0.01 or the number of iterations exceeds 500, training stops and the weights are saved.
The key point of the identification method for the dense bird group is that,
the method comprises the following steps:
step C1: acquiring a new bird group photo;
step C2: processing the new bird group photo by using the FCNN full convolution neural network and the final weight value to generate a corresponding probability density graph F;
step C3: the probability density map F is counted.
The key point of the identification method for the dense bird group is that the step C3 counts the probability density map by using the following formula (6);
C(F) = \int_H f(x)\,dx    (6)
wherein F represents the probability density map of the new bird group photo, f(x) represents the probability density at any point x in the map F, and the integral is taken over the whole map; the value of C(F) is finally output as the counting result.
The above formula generates the count result C(F) by integrating the generated probability density map F.
Once the weight value is obtained, a new bird photo can be taken for inference, a corresponding probability density graph F is generated, and then counting is carried out.
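On a discrete density map the integral of formula (6) reduces to a pixel sum; a minimal sketch (function name hypothetical):

```python
import numpy as np

def count_birds(density_map):
    """Formula (6): C(F) = integral of f(x) over the map F; on a discrete
    probability density map this is the sum of all pixel values, rounded
    to a whole number of birds."""
    return int(round(float(density_map.sum())))

# each marked bird carries unit mass, so the integral recovers the count
F = np.zeros((10, 10))
for x, y in [(2, 3), (5, 5), (7, 1)]:
    F[y, x] += 1.0
print(count_birds(F))  # 3
```

Because the adaptive Gaussian smoothing preserves each bird's unit mass, the same sum works on the smoothed map that the network actually outputs.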
Through this arrangement of the identification method, the density map is converted into a count by integrating it; the method uses a custom special convolutional neural network to achieve a faster effect, and the bird identification and counting results are given at one time.
As shown in fig. 4, a diagram of a specific use of the method: step 1 obtains a bird group data set; step 2 labels the bird group pictures; step 3 generates the bird probability density maps; step 4 performs enhancement processing on the bird group data set; step 5 processes the enhanced images with the custom FCNN full convolution neural network; step 6 integrates the probability density map; and step 7 outputs the bird data.
As shown in fig. 5, a schematic diagram of the color table: the colors are generated by interpolating between several anchor points, and the darker the color appears to the human eye and the further right it lies, the denser the birds. The right end of the color table is dark red, with color [0.5, 0, 0], indicating the densest birds and represented by probability 1; the left end, [0, 0, 0.5], is dark blue, indicating scarce birds and represented by probability 0.
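The color table of fig. 5 can be sketched as a linear interpolation between its two stated end points (the intermediate anchor colors are not given in the text, so a two-point table is assumed here; names hypothetical):

```python
import numpy as np

def density_to_color(density, vmax):
    """Map density values to colors by linear interpolation between the two
    end points of the schematic: dark blue [0, 0, 0.5] at probability 0 and
    dark red [0.5, 0, 0] at probability 1."""
    t = np.clip(density / vmax, 0.0, 1.0)[..., None]  # normalize to [0, 1]
    blue = np.array([0.0, 0.0, 0.5])
    red = np.array([0.5, 0.0, 0.0])
    return (1.0 - t) * blue + t * red                 # table look-up by interpolation

d = np.array([[0.0, 0.5], [1.0, 2.0]])
rgb = density_to_color(d, vmax=1.0)
```

This is the table look-up mapping of step A6: every density value becomes a color, and values beyond the table's top are clipped to the dense end.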
The invention abandons segmentation-based identification for counting birds and instead counts them with the probability density map. When the probability density map is generated, an adaptive Gaussian convolution kernel is chosen to address the near-large, far-small perspective problem; for the bird body, the value of β is 0.37. A custom special full convolution neural network is used, with ResNet18 selected as its backbone and hole convolution added, giving better precision and speed. The final bird population count is obtained by integrating the density map.
The invention has the following characteristics:
the method comprises the following steps that firstly, in the first stage, birds are densely labeled in a probability density graph mode, so that a dense bird swarm graph is changed into a probability density graph; for the bird body, our value for β is 0.37.
Second, deep recognition is performed on the basis of the first-stage probability density map: a density map is generated using a deep neural network algorithm, and the result produced by the network is likewise a density map;
third, the density map is converted into a count by integrating it;
fourth, the method uses a custom special convolutional neural network to achieve a faster effect; moreover, the bird identification and counting results are presented at one time;
finally, it is noted that: the above-mentioned embodiments are only examples of the present invention, and it is a matter of course that those skilled in the art can make modifications and variations to the present invention, and it is considered that the present invention is protected by the modifications and variations if they are within the scope of the claims of the present invention and their equivalents.

Claims (9)

1. A deep learning method for dense bird groups is characterized by comprising a probability density map generation process of a bird group photo and a training process of a full convolution neural network;
the generation flow of the probability density chart of the bird group photo comprises the following steps:
step A1: establishing a bird group photo set, and inputting the bird group photo into the bird group photo set;
step A2: formulating a color table, wherein the color table is provided with corresponding bird density numerical values;
step A3: sequentially taking out the pictures of the bird group from the picture set of the bird group;
step A4: marking all the birds in the photos of the bird group by using a marking tool;
step A5: converting the marked images of the bird population into a continuous density function using an adaptive gaussian convolution;
step A6: performing table look-up mapping on the numerical value of the continuous density function and the bird density numerical value of the color table to obtain a corresponding visual probability density map A;
step A7: judging whether an unlabeled bird group photo exists, if so, turning to the step A3, and if not, ending;
the training process of the full convolution neural network comprises the following steps:
step B1: acquiring a marked bird group photo set;
step B2: performing image augmentation processing on the bird group photos in the bird group photo set to obtain augmented bird group images;
step B3: establishing a self-defined FCNN full convolution neural network;
step B4: obtaining a loss function corresponding to the FCNN full convolution neural network;
step B5: determining a termination loss function threshold value and an iteration number threshold value of a loss function;
step B6: sequentially taking out bird group images from the bird group photo set, and inputting the bird group images into an FCNN full convolution neural network to obtain a probability density graph B;
step B7: calculating a function value of a corresponding loss function according to the probability density graph B and the corresponding probability density graph A, and training an FCNN full convolution neural network to obtain a corresponding weight value;
step B8: judging whether the function value of the loss function is smaller than the termination loss function threshold; if so, turning to step B10, and if not, turning to step B9;
step B9: judging whether the iteration times of the loss function is greater than an iteration time threshold value or not; if yes, go to step B10, otherwise go to step B6;
step B10: the weight value is saved as the final weight value.
2. The deep learning method for dense bird groups according to claim 1,
the colors of the color table formulated in the step A2 change from light to dark, and the bird density numerical values are correspondingly set from low to high.
3. The deep learning method for dense bird groups according to claim 1,
the step A5 includes:
step A51: representing the marked images of the bird group by using a bird group image function H (x);
let x_i denote the center coordinate of the mark point of a bird in the bird group image; a bird group image carrying N bird marks can be represented by the bird group image function H(x):
H(x) = \sum_{i=1}^{N} \delta(x - x_i)    (1)
in the formula (1), x represents an arbitrary image point in the bird group photograph, and \delta denotes the unit impulse function centered at the mark point coordinate x_i;
step A52: converting a bird population image function h (x) to a continuous density function using adaptive gaussian convolution;
the formula of the adaptive gaussian convolution is:
F(x) = \sum_{i=1}^{N} \delta(x - x_i) * G_{\sigma_i}(x)    (2)
wherein in the formula (2), F(x) is the continuous density function and \sigma_i is the standard deviation of the Gaussian kernel;
\sigma_i = \beta \bar{d}_i    (3)
in the formula (3), \beta is a hyper-parameter taking the value 0.37; \bar{d}_i represents the average of the Euclidean distances from the mark point center coordinate x_i to its k adjacent mark points,
\bar{d}_i = \frac{1}{k} \sum_{j=1}^{k} d_j^i    (4)
wherein k in formula (4) represents the number of mark points adjacent to the center coordinate x_i, and d_j^i represents the Euclidean distance from x_i to the j-th of them;
in step a6, table look-up mapping is performed on the value of the continuous density function f (x) and the bird density value of the color table to obtain a corresponding visualized probability density map a represented by color.
4. The method for deep learning of dense bird groups according to claim 1, wherein said step B2 includes,
step B21: horizontally turning the bird group photos in the bird group photo set;
step B22: and (3) cutting each horizontally turned bird group photo for 4 times to obtain 4 image sub-blocks, wherein each image sub-block is 1/4 of the original bird group photo, removing the original bird group photo, and adding the cut image sub-blocks serving as bird group images into a bird group photo set.
5. The method as claimed in claim 4, wherein the FCNN full convolution neural network of step B3 has a network structure table of
[Network structure table: a ResNet18 backbone followed by a custom 7-layer head of conv3 (3×3) hole convolutions, with channel counts 64, 128, 256 and 512.]
6. The method of claim 4, wherein the corresponding loss function of the FCNN full convolution neural network is:
L(\Theta) = \frac{1}{2N} \sum_{i=1}^{N} \| X_i - F_i \|_2^2    (5)
wherein in the formula (5), N represents the number of marked points in the bird group photo, X_i represents the bird density map obtained after processing by the FCNN full convolution neural network, and F_i represents the continuous density function of the bird group image.
7. The method of claim 4, wherein the termination loss function threshold is 0.01; the threshold number of iterations is 500.
8. The deep learning method for dense bird groups according to claim 1, further comprising the following steps:
step C1: acquiring a new bird group photo;
step C2: processing the new bird group photo by using the FCNN full convolution neural network and the final weight value to generate a corresponding probability density graph F;
step C3: the probability density map F is counted.
9. The method for deep learning of dense bird groups as claimed in claim 8, wherein said step C3 is implemented by counting probability density maps using the following formula (6);
C(F) = ∫_H F(x) dx    (6)

wherein F represents the probability density map of the new bird group photo, H represents the image domain of F, F(x) represents the probability density at any point x in the map F, and finally the value of C(F) is output as the counting result.
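For a discrete density map, the integral of formula (6) reduces to a sum over all pixels. A minimal sketch, assuming the map is held as a numpy array (the function name is illustrative):

```python
import numpy as np

def count_birds(density_map):
    """Formula (6): the count is the integral of the probability density
    map over the image domain, i.e. the sum of all pixel values for a
    discrete map."""
    return float(density_map.sum())
```

In practice the result is usually rounded to the nearest integer before being reported as the number of birds.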
CN201910522144.6A 2019-06-17 2019-06-17 Deep learning and identification method for dense bird group Active CN110210577B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910522144.6A CN110210577B (en) 2019-06-17 2019-06-17 Deep learning and identification method for dense bird group

Publications (2)

Publication Number Publication Date
CN110210577A (en) 2019-09-06
CN110210577B (en) 2021-01-29

Family

ID=67793085

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910522144.6A Active CN110210577B (en) 2019-06-17 2019-06-17 Deep learning and identification method for dense bird group

Country Status (1)

Country Link
CN (1) CN110210577B (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111259833A (en) * 2020-01-20 2020-06-09 青岛大学 Vehicle counting method based on traffic images
CN111493055A (en) * 2020-03-25 2020-08-07 深圳威阿科技有限公司 Multi-airport-collaborative airspace intelligent bird repelling system and method
CN111414870A (en) * 2020-03-25 2020-07-14 深圳威阿科技有限公司 Intelligent bird repelling system and method for airport airspace
CN111709374B (en) * 2020-06-18 2023-06-27 深圳市赛为智能股份有限公司 Bird condition detection method, bird condition detection device, computer equipment and storage medium
CN116935310A (en) * 2023-07-13 2023-10-24 百鸟数据科技(北京)有限责任公司 Real-time video monitoring bird density estimation method and system based on deep learning

Citations (4)

Publication number Priority date Publication date Assignee Title
EP1986157A2 (en) * 2007-04-27 2008-10-29 DreamWorks Animation LLC Placing skin-attached features on a computer generated character
CN104778447A (en) * 2015-03-23 2015-07-15 中国民航大学 Grid unit characteristic-based crowd massing behavior detection method
CN106650913A (en) * 2016-12-31 2017-05-10 中国科学技术大学 Deep convolution neural network-based traffic flow density estimation method
CN108537818A (en) * 2018-03-07 2018-09-14 上海交通大学 Crowd's trajectory predictions method based on cluster pressure LSTM

Family Cites Families (4)

Publication number Priority date Publication date Assignee Title
US7853042B2 (en) * 2006-01-11 2010-12-14 Siemens Corporation Tunable kernels for tracking
WO2015066338A1 (en) * 2013-10-30 2015-05-07 St. Petersburg State University Visualization, sharing and analysis of large data sets
CN104077613B (en) * 2014-07-16 2017-04-12 电子科技大学 Crowd density estimation method based on cascaded multilevel convolution neural network
CN104992223B (en) * 2015-06-12 2018-02-16 安徽大学 Intensive Population size estimation method based on deep learning

Non-Patent Citations (2)

Title
An evaluation of crowd counting methods, features and regression models; Ryan D. et al.; Computer Vision and Image Understanding; 2015-12-31; full text *
Improved crowd counting method based on convolutional neural networks; Zhang Hongying et al.; Laser & Optoelectronics Progress; 2018-12-31; full text *

Similar Documents

Publication Publication Date Title
CN110210577B (en) Deep learning and identification method for dense bird group
CN109872285B (en) Retinex low-illumination color image enhancement method based on variational constraint
CN105740945B (en) A kind of people counting method based on video analysis
CN108562589A (en) A method of magnetic circuit material surface defect is detected
WO2020125057A1 (en) Livestock quantity identification method and apparatus
CN104240264B (en) The height detection method and device of a kind of moving object
CN109886161B (en) Road traffic identification recognition method based on likelihood clustering and convolutional neural network
CN108388905B (en) A kind of Illuminant estimation method based on convolutional neural networks and neighbourhood context
CN112598713A (en) Offshore submarine fish detection and tracking statistical method based on deep learning
CN103927016A (en) Real-time three-dimensional double-hand gesture recognition method and system based on binocular vision
CN108764242A (en) Off-line Chinese Character discrimination body recognition methods based on deep layer convolutional neural networks
CN110288033B (en) Sugarcane top feature identification and positioning method based on convolutional neural network
CN106951869A (en) A kind of live body verification method and equipment
CN111738279B (en) Non-contact type automatic acquisition device and method for fish morphological phenotype
CN112750106A (en) Nuclear staining cell counting method based on incomplete marker deep learning, computer equipment and storage medium
CN105654085A (en) Image technology-based bullet hole recognition method
CN108256481A (en) A kind of pedestrian head detection method using body context
CN111709305B (en) Face age identification method based on local image block
CN109003287A (en) Image partition method based on improved adaptive GA-IAGA
CN111968081A (en) Fish shoal automatic counting method and device, electronic equipment and storage medium
CN115311316A (en) Small watermelon identification and positioning method in three-dimensional cultivation mode based on deep learning
CN111325181B (en) State monitoring method and device, electronic equipment and storage medium
CN111160107B (en) Dynamic region detection method based on feature matching
CN110348344B (en) Special facial expression recognition method based on two-dimensional and three-dimensional fusion
CN110111239B (en) Human image head background blurring method based on tof camera soft segmentation

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant