CN113159183A - Micro-pest image identification method based on local dense area density feature detection


Info

Publication number
CN113159183A
CN113159183A
Authority
CN
China
Prior art keywords
pest
dense
network
region
area
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202110440782.0A
Other languages
Chinese (zh)
Other versions
CN113159183B (en)
Inventor
王儒敬
杜健铭
陈天娇
谢成军
张洁
李�瑞
陈红波
胡海瀛
刘海云
Current Assignee
Hefei Institutes of Physical Science of CAS
Original Assignee
Hefei Institutes of Physical Science of CAS
Priority date
Filing date
Publication date
Application filed by Hefei Institutes of Physical Science of CAS filed Critical Hefei Institutes of Physical Science of CAS
Priority to CN202110440782.0A
Publication of CN113159183A
Application granted
Publication of CN113159183B
Active legal status
Anticipated expiration

Classifications

    • G06F18/214 Generating training patterns; bootstrap methods, e.g. bagging or boosting
    • G06F18/25 Fusion techniques
    • G06F18/253 Fusion techniques of extracted features
    • G06N3/045 Neural networks; combinations of networks
    • G06N3/048 Neural networks; activation functions
    • G06N3/084 Backpropagation, e.g. using gradient descent
    • G06V2201/07 Target detection

Abstract

The invention relates to a tiny pest image identification method based on local dense region density feature detection, which overcomes the low identification rate for tiny pests in the prior art. The invention comprises the following steps: acquiring training images; constructing a pest dense area detection network; training the pest dense area detection network; standardizing pest dense areas; constructing and training a local area pest target detection network group; constructing and training a global pest target detection network; fusing pest detection results; acquiring the image of the pests to be detected; and obtaining the pest image detection result. By using the density feature information of tiny pest aggregation areas, the invention accurately delimits dense areas and performs individual pest target detection within them, which overcomes the missed detections and low detection precision that global pest target detection suffers in such areas and improves the overall precision of tiny pest image detection.

Description

Micro-pest image identification method based on local dense area density feature detection
Technical Field
The invention relates to the technical field of image processing, in particular to a tiny pest image identification method based on local dense region density feature detection.
Background
Most traditional agricultural pest forecasting methods rely on manual field investigation for identification and quantity estimation. Identification accuracy depends on the investigators' professional knowledge, and quantity estimates are affected by their subjective judgment, so forecasting results vary widely. In recent years, pest identification and detection algorithms based on machine vision and image processing technologies have been widely applied to agricultural pest identification and detection work, greatly reducing the labor cost of field investigation and improving the accuracy of identification and counting.
In practical application, although existing target detection algorithms perform well on pests of large size and high distinguishability, they suffer from frequent missed detections and poor detection precision on pests of small size and high aggregation density, such as wheat aphids. This is because a global target detection algorithm operating on the entire image has a low detection resolution and struggles to resolve tiny targets. Directly raising the detection resolution of a global target detection algorithm greatly increases its computational burden and occupies large amounts of computing resources, and so cannot meet practical application requirements.
Therefore, improving the detection of tiny pests while maintaining operating efficiency has become a technical problem that the tiny pest detection task urgently needs to solve.
Disclosure of Invention
The invention aims to overcome the low identification rate for tiny pests in the prior art, and provides a tiny pest image identification method based on local dense area density feature detection to solve this problem.
In order to achieve the purpose, the technical scheme of the invention is as follows:
a tiny pest image identification method based on local dense region density feature detection comprises the following steps:
11) acquisition of training images: acquiring a pest image data set with an artificial mark;
12) constructing a pest dense area detection network: constructing a pest dense area detection network, wherein the pest dense area detection network comprises an overall feature extraction network and a dense area suggestion network;
13) training of a pest dense area detection network: training a pest dense area detection network by using a pest image data set;
14) standardization of pest dense areas: performing standardized operation on a local pest region output by the pest dense region detection network; the standardized operation of the pest dense region comprises pest dense region merging operation and pest dense region segmentation operation, pest local regions output by a pest dense region detection network are input, and standardized image local regions which are grouped according to density scores and are similar in size are output;
15) constructing and training a local area pest target detection network group: constructing and training a local area pest target detection network group; inputting image local areas which are obtained by standardizing pest dense areas and grouped according to density scores, and outputting pest identification and positioning results in the image local areas which are grouped according to the density scores;
16) constructing and training a global pest target detection network;
17) and (3) fusing pest detection results: fusing the pest identification and positioning results output by the global pest target detection network and the local region pest target detection network group to obtain a global pest identification and positioning result;
18) obtaining an image of the pest to be detected: acquiring a tiny pest image to be detected;
19) and obtaining a pest image detection result.
The construction of the pest dense area detection network comprises the following steps:
21) setting an overall feature extraction network: the overall feature extraction network comprises a backbone network and a feature fusion network; the backbone network consists of stacked convolutional layers, pooling layers and activation-function layers and is used to extract the basic features in the picture and output multiple layers of feature maps; the feature fusion network laterally connects the multi-layer feature maps output by the backbone network to fuse the feature maps of all layers, and outputs an overall feature map that takes account of semantic features at different levels; the backbone network is a ResNet50 network, and the feature fusion network is an FPN feature pyramid network;
22) setting a dense area proposal network: setting the input of the dense area suggestion network as an overall characteristic graph output by an overall characteristic extraction network, and outputting the overall characteristic graph as a density score of a selected area which takes each anchor point as the center;
the dense area proposal network firstly uses a convolution layer with convolution kernel size of 3 multiplied by 3 and provided with 512 channels, then uses a linear rectification function ReLu as a convolution layer activation function, and then uses a convolution layer with convolution kernel size of 1 multiplied by 1, and the number S multiplied by R of the area shape number S and the area magnification ratio number is used for determining the channel number S multiplied by R of the convolution layer.
The training of the pest dense area detection network comprises the following steps:
31) inputting a pest image data set with artificial labels into an overall feature extraction network, extracting an image basic feature map through a backbone network, and outputting an overall feature map after multi-layer semantics are mutually fused through a feature fusion network by the basic feature map;
32) The global feature map is input into the dense area proposal network.
An anchor point A slides over the feature map with a single sliding step k. Centered on the anchor there are S shapes of region selection boxes, each with R magnification ratios. When the anchor slides to the i-th position, the number of artificial marks contained in the s-th selection box at the r-th magnification ratio is the target number $n_i^{s,r}$, and the area of the current region selection box is $a_i^{s,r}$. The target density score in the current selection box is represented by the following formula:

$$d_i^{s,r} = \log\left(\frac{n_i^{s,r}}{a_i^{s,r}}\right) + O$$

where O is a deviation compensation coefficient that ensures the target density score is positive; in application O = 10, and the target density score has maximum value $d_{\max} = 4$ and minimum value $d_{\min} = 1$.
The target density score $d_i^{s,r}$ of the current selection box is taken as the true density score, and the score $\hat{d}_i^{s,r}$ that the network outputs through its convolution layers from the global feature map is taken as the predicted target density score of the current selection box. The loss function generated by the current image for back-propagation training of the dense area detection network is:

$$L = \frac{1}{I} \sum_{i=1}^{I} \sum_{s=1}^{S} \sum_{r=1}^{R} \mathrm{Smooth}_{L1}\left(d_i^{s,r} - \hat{d}_i^{s,r}\right)$$

where $I$ is the number of anchor positions in the image, and the loss of each selection box is computed from the smooth L1 norm $\mathrm{Smooth}_{L1}$:

$$\mathrm{Smooth}_{L1}(x) = \begin{cases} 0.5x^2, & |x| < 1 \\ |x| - 0.5, & \text{otherwise} \end{cases}$$

Finally, for each image the trained pest dense area detection network outputs a series of candidate regions $\{b_1, \ldots, b_M\}$ and corresponding predicted density scores $\{\hat{d}_1, \ldots, \hat{d}_M\}$; the candidate regions with high density scores are the dense regions.
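The score and loss computations above can be sketched in plain Python. The exact density-score formula is not legible in the source; the version below is a plausible reconstruction (logarithm of marks per unit box area, shifted by the compensation coefficient O = 10 and clipped to [d_min, d_max] = [1, 4]) that matches the stated constraints, and the per-box term is the standard smooth L1 norm.

```python
import math

# Sketch of the density-score target and SmoothL1 loss. The density
# formula is a hedged reconstruction consistent with the stated
# constraints (O = 10, scores clipped to [1, 4]), not the patent's
# verbatim equation.

O, D_MIN, D_MAX = 10.0, 1.0, 4.0

def density_score(n_marks, box_area):
    """Target density score for one selection box."""
    if n_marks == 0:
        return D_MIN                      # empty boxes get the floor score
    raw = math.log10(n_marks / box_area) + O
    return max(D_MIN, min(D_MAX, raw))

def smooth_l1(x):
    # Smooth L1 norm applied to each selection box's score error.
    return 0.5 * x * x if abs(x) < 1.0 else abs(x) - 0.5

def region_loss(true_scores, pred_scores):
    # Mean SmoothL1 over all anchor/shape/ratio selection boxes.
    terms = [smooth_l1(t - p) for t, p in zip(true_scores, pred_scores)]
    return sum(terms) / len(terms)

print(region_loss([4.0, 1.0], [3.5, 3.0]))  # prints 0.8125
```

The piecewise SmoothL1 term keeps the gradient bounded for selection boxes whose predicted score is far from the target, which matters early in training when region density predictions are still unreliable.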
The pest dense area standardization comprises the following steps:
41) Merge the candidate regions whose density scores are similar and whose areas highly overlap. The input is the dense region set and corresponding density scores output by the dense region detection network; the output is the merged dense region set and corresponding density scores. The dense region merging steps are as follows:
411) The density scores are divided into 5 groups by score: a score of 1 to 2 is not dense, 2 to 2.5 is sparse, 2.5 to 3 is moderately dense, 3 to 3.5 is generally dense, and 3.5 to 4 is extremely dense.
The artificially marked agricultural tiny pest image is input into the trained pest dense area detection network to obtain the local pest regions $\mathbb{B} = \{b_1, \ldots, b_M\}$ with corresponding density scores $\mathbb{D} = \{d_1, \ldots, d_M\}$, where each dense region is represented by its top-left and bottom-right coordinates, $b_j = (x_j^{1}, y_j^{1}, x_j^{2}, y_j^{2})$.
412) An overlap calculation formula gives the overlap OL(a, b) of two dense regions a and b:

$$OL(a, b) = \frac{\mathrm{area}(a \cap b)}{\min(\mathrm{area}(a), \mathrm{area}(b))}$$

A synthesis threshold $N_t$ is set to judge whether two dense regions a and b with this overlap need to be synthesized; if $OL(a, b)$ is greater than the threshold $N_t$, the dense regions a and b are merged.
413) A merge operation $\mathrm{Merge}(\mathbb{B}_m, \mathbb{D}_m) \to \{b', d'\}$ is set, whose inputs $\mathbb{B}_m$ and $\mathbb{D}_m$ are the set of regions to be merged and their corresponding density scores, and whose output $\{b', d'\}$ is the newly merged region and its density score. The smallest $x^{1}$ and $y^{1}$ and the largest $x^{2}$ and $y^{2}$ over $\mathbb{B}_m$ are taken as the top-left and bottom-right coordinates of the synthesized region $b'$; the density score of the synthesized region is the minimum value over $\mathbb{D}_m$, denoted $d'$.
414) The region $b_k$ whose corresponding density score $d_k$ is the largest is taken from the dense region set $\mathbb{B}$, and the overlap $OL(b_k, b_i)$ is computed against every other region $b_i$ in $\mathbb{B}$. If the overlap is greater than the synthesis threshold $N_t$ and the corresponding density scores $d_k$ and $d_i$ belong to the same density group, then $b_i$ and $d_i$ are removed and placed into the merge candidate sets $\mathbb{B}_m$ and $\mathbb{D}_m$. If the candidate set is not empty after the traversal finishes, the currently selected $b_k$ and $d_k$ are also placed into the candidate sets, the merge operation is performed, and the output $\{b', d'\}$ is put back into $\mathbb{B}$ and $\mathbb{D}$; otherwise $b_k$ and $d_k$ are placed into the output sets $\mathbb{B}_o$ and $\mathbb{D}_o$. The above operations are repeated until all regions in $\mathbb{B}$ have been taken out.
415) The sets $\mathbb{B}_o$ and $\mathbb{D}_o$ are the output merged dense regions and their corresponding density scores.
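A minimal sketch of the merge machinery in steps 412 to 414, assuming the overlap OL(a, b) is intersection area divided by the smaller region's area (the patent's own formula image is not reproduced in the source) and using a hypothetical synthesis threshold N_t:

```python
# Sketch of the dense-region overlap and merge operations. The overlap
# formula is an assumption (intersection over the smaller region's
# area); the merged region takes the smallest top-left and largest
# bottom-right corners, and the minimum density score, per step 413.

def area(b):
    x1, y1, x2, y2 = b
    return max(0, x2 - x1) * max(0, y2 - y1)

def overlap(a, b):
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = area((ix1, iy1, ix2, iy2))
    return inter / min(area(a), area(b))

def merge(regions, scores):
    # Synthesized region b': min top-left, max bottom-right; its
    # density score d' is the minimum of the merged scores.
    b = (min(r[0] for r in regions), min(r[1] for r in regions),
         max(r[2] for r in regions), max(r[3] for r in regions))
    return b, min(scores)

a, b = (0, 0, 10, 10), (5, 5, 20, 20)
N_t = 0.2                                  # hypothetical synthesis threshold
if overlap(a, b) > N_t:
    merged, d = merge([a, b], [3.6, 3.2])
    print(merged, d)                       # prints (0, 0, 20, 20) 3.2
```

Taking the minimum score for the merged region is conservative: it routes the synthesized region to the less-dense detection group, which only ever lowers the density assumed by the downstream detector.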
42) Perform a segmentation operation on oversized regions among the merged dense regions. The input is the merged dense region set and corresponding density scores output by the merge operation; the output is the segmented dense region set and corresponding density scores. The segmentation steps are as follows:
421) A segmentation threshold $L_t$ is set to judge whether the current dense region a needs to be segmented;
422) A region $b_i$ is taken from $\mathbb{B}_o$. If the region does not need to be segmented, it is placed into the output set $\mathbb{B}_s$ together with its corresponding density score. Otherwise a halving operation is performed, an overlapping region of $L_t/4$ is kept at the halving boundary, the density scores of the segmented sub-regions are kept equal to the original density score, and the sub-regions are placed into the output set $\mathbb{B}_s$. The above operations are repeated until all regions in $\mathbb{B}_o$ have been taken out;
423) The set $\mathbb{B}_s$ contains the output segmented dense regions and their corresponding density scores.
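The halving operation of step 422 can be sketched as follows, under two stated assumptions: a region is taken to need segmentation when its longer side exceeds the threshold L_t (the patent names L_t but does not spell out the trigger), and the L_t/4 overlapping region shared by the two halves is realized by extending each half L_t/8 past the midline.

```python
# Sketch of the oversized-region halving (step 422). Assumptions: a
# region needs segmentation when its longer side exceeds L_t, and each
# half extends L_t/8 past the midline so the halves share an L_t/4
# overlap strip. Both halves keep the parent's density score.

def split_region(box, L_t):
    """box = (x1, y1, x2, y2); returns one box or two overlapping halves."""
    x1, y1, x2, y2 = box
    w, h = x2 - x1, y2 - y1
    if max(w, h) <= L_t:
        return [box]                       # small enough: no segmentation
    pad = L_t / 8                          # halves overlap by 2*pad = L_t/4
    if w >= h:                             # halve along the longer side
        mid = x1 + w / 2
        return [(x1, y1, mid + pad, y2), (mid - pad, y1, x2, y2)]
    mid = y1 + h / 2
    return [(x1, y1, x2, mid + pad), (x1, mid - pad, x2, y2)]

halves = split_region((0, 0, 200, 100), L_t=100)
print(halves)   # two halves sharing a 25-unit (L_t/4) strip at x = 100
```

The overlap strip prevents pests lying exactly on the halving boundary from being cut in two and missed by both sub-regions; the duplicate detections it can produce are resolved later in the fusion step.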
The construction and training of the local area pest target detection network group comprises the following steps:
51) constructing a plurality of groups of parallel local area pest target detection networks to form a local area pest target detection network group; the input of each group of local area pest target detection network is a local area which is obtained by standardizing a pest dense area and is grouped in a corresponding density mode, and the output is an identification and positioning result of the pest target in the grouped local area; the construction of each group of local area pest target detection network comprises the following steps:
511) setting a pest feature extraction network; the pest feature extraction network is used for extracting an input pest feature map of a local region, inputting a dense region obtained by standardizing a pest region, and outputting the pest feature map based on the dense region;
512) setting a pest identification and positioning network; the pest identification and positioning network is used for automatically learning pest characteristic diagrams and identifying and positioning pests, inputting the obtained pest characteristic diagrams of the dense area, and outputting pest type identification and positioning results of the dense area;
52) training a plurality of groups of parallel local area pest target detection networks by using the obtained standardized pest dense areas; the training comprises the following steps:
521) according to the density scores of the input regions, selecting the sparse, moderately dense, generally dense and extremely dense groups, and grouping the standardized dense regions by their corresponding density scores;
522) and inputting each grouped local dense region into the corresponding grouped local region pest target detection network for training to obtain the trained local region pest target detection network.
The construction and training of the global pest target detection network comprises the following steps:
61) constructing a global pest target detection network comprising an overall feature extraction network and a pest target identification and positioning network;
62) setting an overall characteristic extraction network for extracting a characteristic graph in the whole input picture, wherein the input picture is a tiny pest picture, and the output picture is an overall characteristic graph obtained based on the whole pest picture;
63) setting a pest target identification and positioning network for automatically learning an integral characteristic diagram and detecting pest targets, inputting the integral characteristic diagram, and outputting a pest identification result and a positioning result;
64) and training the global pest target detection network.
The pest image detection result obtaining method comprises the following steps:
71) inputting an agricultural tiny pest image to be detected into a trained pest dense area detection network to obtain a pest local area;
72) standardizing pest dense areas in pest local areas to generate standardized pest dense areas;
73) dividing pest dense areas into groups according to corresponding density scores, and respectively inputting corresponding target detection networks in the trained local area pest target detection network groups to generate local area pest target detection results of the corresponding density groups;
74) inputting an agricultural tiny pest image to be detected into the trained global pest target detection network to obtain a global pest target detection result;
75) and (3) fusion of pest target detection results: and fusing the global pest target detection result and the local region pest target detection result to obtain a final target detection result.
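The mechanism of the fusion in step 75) is not spelled out; a common realization, sketched here as an assumption rather than the patent's prescribed rule, is to concatenate the global and local detections and suppress duplicate boxes of the same class greedily by IoU.

```python
# Hedged sketch of the detection-result fusion (step 75): detections
# from the global network and the local-region network group are pooled
# and duplicates are removed by greedy IoU suppression. The patent does
# not specify the fusion rule; this is one conventional choice.

def iou(a, b):
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    ar = lambda r: (r[2] - r[0]) * (r[3] - r[1])
    return inter / (ar(a) + ar(b) - inter)

def fuse(global_dets, local_dets, iou_thr=0.5):
    # Each detection: (x1, y1, x2, y2, confidence, class_label).
    dets = sorted(global_dets + local_dets, key=lambda d: -d[4])
    kept = []
    for d in dets:
        # Keep d unless a higher-confidence detection of the same class
        # already covers (IoU above threshold) the same box.
        if all(iou(d[:4], k[:4]) <= iou_thr or d[5] != k[5] for k in kept):
            kept.append(d)
    return kept

global_dets = [(10, 10, 30, 30, 0.60, "aphid")]
local_dets = [(11, 11, 29, 29, 0.90, "aphid"), (50, 50, 70, 70, 0.80, "aphid")]
print(len(fuse(global_dets, local_dets)))  # prints 2
```

Because local detections come from upscaled crops, they typically carry higher confidence on tiny targets and therefore win the suppression against the overlapping global box, which is the behavior the patent's fusion step is meant to achieve.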
Advantageous effects
Compared with the prior art, the method for identifying the tiny pest image based on the density characteristic detection of the local dense area accurately divides the dense area and performs the independent pest target detection by using the density characteristic information of the tiny pest gathering area, solves the problems of detection omission, low detection precision and the like of the global pest target detection in the area, and improves the overall detection precision of the tiny pest image detection.
Drawings
FIG. 1 is a flow diagram of the method of the present invention;
FIGS. 2-5 show pest image recognition results of the method of the present invention.
Detailed Description
In order that the above-described features of the present invention may be clearly understood, a more particular description of the invention, briefly summarized above, is given below with reference to embodiments, some of which are illustrated in the appended drawings:
as shown in fig. 1, the method for identifying a tiny pest image based on the density feature detection of a local dense region according to the present invention includes the following steps:
firstly, acquiring a training image: a pest image dataset with artificial markers is acquired.
Secondly, constructing a pest dense area detection network: and constructing a pest dense area detection network, wherein the pest dense area detection network comprises an overall feature extraction network and a dense area suggestion network. The overall characteristic extraction network is used for extracting a characteristic diagram of pests in the whole image, the network inputs the agricultural tiny pest image and outputs the overall characteristic diagram extracted based on the pest image; and the dense region suggestion network predicts the pest dense region and the density degree according to the overall depth feature map, inputs the overall feature map and outputs the dense region and the density score corresponding to each region.
The pest dense region detection network locates pest dense distribution regions in the pest picture and outputs the dense regions to a subsequent local pest target identification and positioning network for individual detection. In the process, the resolution ratio of the tiny pest targets in the local area is increased, the difficulty in identifying and positioning the tiny pests by a pest target identification and positioning network is reduced, and the identification, positioning and detection performance of the tiny pest targets by the identification and positioning network is finally improved. The difficulty lies in the accurate resolution of dense region targets in the overall feature map and the correct prediction of region density scores. When training is insufficient, the network has the problems that dense region selection is not accurate, and the region density score is not in accordance with the reality seriously. The method comprises the following specific steps:
(1) setting an overall feature extraction network: the overall feature extraction network comprises a backbone network and a feature fusion network; the backbone network consists of stacked convolutional layers, pooling layers and activation-function layers and is used to extract the basic features in the picture and output multiple layers of feature maps; the feature fusion network laterally connects the multi-layer feature maps output by the backbone network to fuse the feature maps of all layers, and outputs an overall feature map that takes account of semantic features at different levels; the backbone network is a ResNet50 network, and the feature fusion network is an FPN feature pyramid network.
(2) Setting a dense area proposal network: setting the input of the dense area suggestion network as an overall characteristic graph output by an overall characteristic extraction network, and outputting the overall characteristic graph as a density score of a selected area which takes each anchor point as the center;
the dense area proposal network firstly uses a convolution layer with convolution kernel size of 3 multiplied by 3 and provided with 512 channels, then uses a linear rectification function ReLu as a convolution layer activation function, and then uses a convolution layer with convolution kernel size of 1 multiplied by 1, and the number S multiplied by R of the area shape number S and the area magnification ratio number is used for determining the channel number S multiplied by R of the convolution layer.
Thirdly, training the pest dense area detection network: the pest dense area detection network is trained with the pest image data set. The dense region proposal network within it is trained on the basis of the target density score inside each selection box. Other prior art mainly obtains an initial target detection result for the whole image with a preliminary detection pass, and then selects dense areas in the image by applying clustering, heat maps, or similar methods to that result. Compared with such approaches, this method judges the density of a region more directly and accurately, the density score accounts for both the number of targets in a region and the region's size, and the computational burden is smaller. The technical difficulty is that the target density score encodes complex information, and a large amount of dense-region information is needed as training samples to obtain accurate density score predictions. The specific steps are as follows:
(1) inputting a pest image data set with artificial labels into an overall feature extraction network, extracting an image basic feature map through a backbone network, and outputting an overall feature map after multi-layer semantics are mutually fused through a feature fusion network by the basic feature map;
(2) The global feature map is input into the dense area proposal network.
An anchor point A slides over the feature map with a single sliding step k. Centered on the anchor there are S shapes of region selection boxes, each with R magnification ratios. When the anchor slides to the i-th position, the number of artificial marks contained in the s-th selection box at the r-th magnification ratio is the target number $n_i^{s,r}$, and the area of the current region selection box is $a_i^{s,r}$. The target density score in the current selection box is represented by the following formula:

$$d_i^{s,r} = \log\left(\frac{n_i^{s,r}}{a_i^{s,r}}\right) + O$$

where O is a deviation compensation coefficient that ensures the target density score is positive; in application O = 10, and the target density score has maximum value $d_{\max} = 4$ and minimum value $d_{\min} = 1$.
The target density score $d_i^{s,r}$ of the current selection box is taken as the true density score, and the score $\hat{d}_i^{s,r}$ that the network outputs through its convolution layers from the global feature map is taken as the predicted target density score of the current selection box. The loss function generated by the current image for back-propagation training of the dense area detection network is:

$$L = \frac{1}{I} \sum_{i=1}^{I} \sum_{s=1}^{S} \sum_{r=1}^{R} \mathrm{Smooth}_{L1}\left(d_i^{s,r} - \hat{d}_i^{s,r}\right)$$

where $I$ is the number of anchor positions in the image, and the loss of each selection box is computed from the smooth L1 norm $\mathrm{Smooth}_{L1}$:

$$\mathrm{Smooth}_{L1}(x) = \begin{cases} 0.5x^2, & |x| < 1 \\ |x| - 0.5, & \text{otherwise} \end{cases}$$

Finally, for each image the trained pest dense area detection network outputs a series of candidate regions $\{b_1, \ldots, b_M\}$ and corresponding predicted density scores $\{\hat{d}_1, \ldots, \hat{d}_M\}$; the candidate regions with high density scores are the dense regions.
Fourthly, standardizing the pest dense area: performing standardized operation on a local pest region output by the pest dense region detection network; the standardized operation of the pest dense region comprises a merging operation of the pest dense region and a splitting operation of the pest dense region, a pest local region output by a pest dense region detection network is input, and standardized image local regions which are grouped according to density scores and have similar sizes are output.
The pest dense areas are standardized: highly overlapping regions are merged, which reduces the computational burden of subsequent target detection; at the same time, oversized regions are grouped by density and segmented, finally yielding standardized regions of similar density and size, which avoids the training difficulty and insufficient precision that uneven density and large size spans would cause in the subsequent detection network.
The design difficulty lies in effectively combining the density scores predicted by the dense region network, using a reasonable score-grouping basis, and setting merge and segmentation thresholds of appropriate size; these parameters can only be confirmed through a large number of experiments to achieve the best effect. The specific steps are as follows:
(1) Merge candidate regions whose density scores are similar and whose areas highly overlap. The input is the dense region set and corresponding density scores output by the dense region detection network, and the output is the merged dense region set and corresponding density scores. The dense region merging steps are as follows:
A1) Divide the density scores into 5 groups by value: a score of 1-2 is not dense, 2-2.5 is sparse, 2.5-3 is moderately dense, 3-3.5 is generally dense, and 3.5-4 is extremely dense;
input the agricultural tiny pest images with artificial labels into the trained pest dense area detection network to obtain the local pest regions {b_1, ..., b_n}, where each dense region is represented by its top-left and bottom-right coordinates (x1, y1, x2, y2).
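The five score groups of step A1 can be sketched as a small lookup function. The interval boundaries are treated as half-open here, which is an assumption, since the source only lists the ranges:

```python
def density_group(score: float) -> str:
    """Map a predicted density score in [1, 4] to one of the five groups
    listed in step A1. Half-open interval boundaries are an assumption."""
    if score < 2.0:
        return "not dense"
    elif score < 2.5:
        return "sparse"
    elif score < 3.0:
        return "moderately dense"
    elif score < 3.5:
        return "generally dense"
    return "extremely dense"
```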
A2) Set an overlap calculation formula to compute the overlap OL(a, b) of two dense regions a and b, and set a merging threshold N_t to judge whether the two dense regions a and b need to be merged: if OL(a, b) is greater than the threshold N_t, the dense regions a and b are merged;
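The overlap formula itself is rendered only as an image in the source; one plausible reading, given that the goal is to find highly overlapped regions, is intersection area over the smaller region's area. A sketch under that assumption, with a placeholder threshold value:

```python
def overlap(a, b):
    """OL(a, b) for boxes (x1, y1, x2, y2): intersection area divided by
    the smaller box's area (an assumed reading of the source formula)."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area = lambda r: (r[2] - r[0]) * (r[3] - r[1])
    return inter / min(area(a), area(b))

N_T = 0.5  # merging threshold N_t; the patent tunes this experimentally

def should_merge(a, b):
    return overlap(a, b) > N_T
```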
A3) Set a merge operation M, whose input is the set of regions to be merged and their corresponding density scores, and whose output {b', d'} is the newly merged region and its corresponding density score. The smallest top-left coordinates and the largest bottom-right coordinates among the input regions are taken as the top-left and bottom-right coordinates of the merged region b'; the density score d' corresponding to the merged region is the minimum of the input regions' density scores.
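The merge operation M of step A3 reduces to coordinate minima/maxima and a score minimum, and can be sketched as:

```python
def merge_regions(boxes, scores):
    """Merge operation M: combine regions (x1, y1, x2, y2) into one box b'
    spanning the smallest top-left and largest bottom-right coordinates;
    the merged density score d' is the minimum of the input scores."""
    x1 = min(b[0] for b in boxes)
    y1 = min(b[1] for b in boxes)
    x2 = max(b[2] for b in boxes)
    y2 = max(b[3] for b in boxes)
    return (x1, y1, x2, y2), min(scores)
```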
A4) From the dense region set, take out the region b_k whose corresponding density score d_k is the largest, and traverse all other regions b_i in the set, computing the overlap OL(b_k, b_i); if the overlap is greater than the merging threshold N_t and the corresponding density scores d_k and d_i belong to the same density group, move b_i and d_i into the merge candidate set. If the candidate set is not empty after the traversal is finished, also put the currently selected b_k and d_k into the candidate set, perform the merge operation M, and put the output {b', d'} back into the region and score sets; otherwise, put b_k and d_k into the output set. The above operations are repeated until all regions in the set have been taken out;
A5) The output set then contains the merged dense regions and their corresponding density scores;
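Steps A4-A5 together form a greedy merging loop. A self-contained sketch follows; the overlap measure (intersection over the smaller area) and the threshold value are assumptions, since the source gives the formula only as an image and tunes the threshold experimentally:

```python
def merge_dense_regions(boxes, scores, n_t=0.5):
    """Greedy merging loop of steps A4-A5 (a sketch under assumptions)."""
    def area(r):
        return (r[2] - r[0]) * (r[3] - r[1])

    def ol(a, b):  # assumed overlap: intersection over the smaller area
        ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
        ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
        inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
        return inter / min(area(a), area(b))

    def group(d):  # the five density groups of step A1
        for hi, g in ((2.0, 0), (2.5, 1), (3.0, 2), (3.5, 3)):
            if d < hi:
                return g
        return 4

    pool = list(zip(boxes, scores))
    out = []
    while pool:
        # take the region with the highest density score (b_k, d_k)
        k = max(range(len(pool)), key=lambda i: pool[i][1])
        bk, dk = pool.pop(k)
        # candidates: highly overlapped regions in the same density group
        cand = [(b, d) for b, d in pool
                if ol(bk, b) > n_t and group(d) == group(dk)]
        if cand:
            pool = [bd for bd in pool if bd not in cand]
            cand.append((bk, dk))
            merged = (min(b[0] for b, _ in cand), min(b[1] for b, _ in cand),
                      max(b[2] for b, _ in cand), max(b[3] for b, _ in cand))
            # the merged region re-enters the pool for further merging
            pool.append((merged, min(d for _, d in cand)))
        else:
            out.append((bk, dk))
    return out
```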
(2) Perform a splitting operation on the oversized regions among the merged dense regions. The input is the merged dense region set and corresponding density scores output by the merging operation, and the output is the split dense region set and corresponding density scores. The splitting steps are as follows:
B1) Set a splitting threshold L_t to judge whether the current dense region a needs to be split;
B2) Take a region b_i out of the merged region set; if the region does not need splitting, put it together with its corresponding density score into the output set; otherwise perform a bisection, retaining an overlapping region of L_t/4 at the bisection boundary, keep the density score of each split part equal to the original score, and put them into the output set. The above operations are repeated until all regions in the set have been taken out;
B3) The output set then contains the split dense regions and their corresponding density scores.
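The splitting operation of steps B1-B3 can be sketched as follows. Which side is bisected, the exact "needs splitting" test, and the threshold value are not spelled out in the source, so here a region is split along its longer side whenever that side exceeds a placeholder threshold, and each half extends L_t/8 past the midpoint so that the retained overlap band is L_t/4 wide:

```python
def split_oversized(regions, l_t=800):
    """Splitting step B2 (a sketch under stated assumptions): bisect each
    oversized region along its longer side, keep an overlap of l_t / 4 at
    the bisection boundary, and keep the original density score."""
    out = []
    for (x1, y1, x2, y2), d in regions:
        w, h = x2 - x1, y2 - y1
        if max(w, h) <= l_t:           # no split needed
            out.append(((x1, y1, x2, y2), d))
            continue
        pad = l_t / 8                  # half of the L_t/4 overlap band
        if w >= h:                     # bisect the width
            mid = x1 + w / 2
            out.append(((x1, y1, mid + pad, y2), d))
            out.append(((mid - pad, y1, x2, y2), d))
        else:                          # bisect the height
            mid = y1 + h / 2
            out.append(((x1, y1, x2, mid + pad), d))
            out.append(((x1, mid - pad, x2, y2), d))
    return out
```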
Fifthly, constructing and training a local area pest target detection network group: constructing and training a local area pest target detection network group; the image local areas grouped according to the density scores and obtained through pest dense area standardization are input, and pest identification and positioning results in the image local areas grouped according to the density scores are output.
In the prior art, a single local area detection network typically performs target recognition and localization within local regions, so that one network must handle many density characteristics, and detection precision suffers. The local area pest target detection network group instead groups local regions of various densities by their target density, so that each detection network recognizes and localizes within regions of similar density, reducing the precision loss caused by density span. The technical difficulty lies in effectively combining this with the standardization operation by manually setting the optimal number of groups and the density-grouping basis; the optimal parameters must be summarized from extensive experiments.
The construction and training of the local area pest target detection network group comprises the following steps:
(1) constructing a plurality of groups of parallel local area pest target detection networks to form a local area pest target detection network group; the input of each group of local area pest target detection network is a local area which is obtained by standardizing a pest dense area and is grouped in a corresponding density mode, and the output is an identification and positioning result of the pest target in the grouped local area; the construction of each group of local area pest target detection network comprises the following steps:
C1) Set a pest feature extraction network, used to extract a pest feature map from the input local region; its input is a dense region obtained from pest dense region standardization, and its output is the pest feature map of that dense region;
C2) Set a pest recognition and localization network, used to automatically learn the pest feature map and recognize and localize pests; its input is the pest feature map of the dense region, and its output is the pest category recognition and localization results for the dense region;
(2) training a plurality of groups of parallel local area pest target detection networks by using the obtained standardized pest dense areas; the training comprises the following steps:
D1) According to the density scores of the input regions, set up three groups covering the sparse, moderately dense, generally dense and extremely dense scores, and group the standardized dense regions by their corresponding density scores;
D2) Input each grouped local dense region into the local area pest target detection network of the corresponding group for training, obtaining the trained local area pest target detection networks.
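One way to picture steps D1-D2 is routing each standardized region to the detector of its density group. The exact three-way grouping boundaries are not fixed in the source, so the cut points below are assumptions, and `detectors` is a hypothetical mapping from group name to a trained network:

```python
def route_group(score: float) -> str:
    """Assumed three-way grouping of density scores for steps D1-D2."""
    if score < 2.5:
        return "sparse"
    if score < 3.5:
        return "dense"            # moderately + generally dense combined
    return "extremely dense"

def detect_by_group(regions, detectors):
    """Run each (crop, score) region through its group's detector."""
    results = []
    for crop, score in regions:
        net = detectors[route_group(score)]   # hypothetical trained net
        results.extend(net(crop))
    return results
```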
And sixthly, constructing and training a global pest target detection network. The global pest target detection network is used to detect sparse pest targets over the whole picture and serves as a supplement to the local dense-region detection results, so that complete recognition and localization results for all pests in the picture are finally obtained; it is constructed and trained according to the prior art.
Firstly, a global pest target detection network is constructed, comprising an overall feature extraction network and a pest target recognition and localization network;
secondly, the overall feature extraction network is set up to extract a feature map from the whole input picture; its input is a tiny pest picture and its output is an overall feature map based on the whole pest picture;
thirdly, the pest target recognition and localization network is set up to automatically learn the overall feature map and detect pest targets; its input is the overall feature map and its outputs are the pest recognition and localization results;
finally, the global pest target detection network is trained.
Seventhly, fusing pest detection results: and fusing the pest identification and positioning results output by the global pest target detection network and the local region pest target detection network group to obtain a global pest identification and positioning result.
Eighthly, acquiring an image of the pest to be detected: and acquiring a micro pest image to be detected.
And ninthly, obtaining a pest image detection result.
(1) Inputting an agricultural tiny pest image to be detected into a trained pest dense area detection network to obtain a pest local area;
(2) standardizing pest dense areas in pest local areas to generate standardized pest dense areas;
(3) dividing pest dense areas into groups according to corresponding density scores, and respectively inputting corresponding target detection networks in the trained local area pest target detection network groups to generate local area pest target detection results of the corresponding density groups;
(4) inputting an agricultural tiny pest image to be detected into the trained global pest target detection network to obtain a global pest target detection result;
(5) Fusion of pest target detection results: fuse the global pest target detection result and the local region pest target detection results to obtain the final target detection result.
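The fusion step is not detailed in the source; a common realization is to map local detections back to global image coordinates and suppress duplicates with class-wise NMS. A sketch under those assumptions (the IoU threshold is a placeholder):

```python
def fuse_detections(global_dets, local_dets, offsets, iou_thr=0.5):
    """Fuse global and local detections. Each detection is
    (box, cls, conf) with box = (x1, y1, x2, y2); local_dets[i] holds the
    detections of region i, whose top-left corner in the full image is
    offsets[i]. Duplicates are removed by class-wise NMS (an assumption)."""
    def iou(a, b):
        ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
        ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
        inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
        ar = lambda r: (r[2] - r[0]) * (r[3] - r[1])
        union = ar(a) + ar(b) - inter
        return inter / union if union else 0.0

    # shift each local detection by its region's top-left corner
    shifted = [((x1 + ox, y1 + oy, x2 + ox, y2 + oy), cls, conf)
               for dets, (ox, oy) in zip(local_dets, offsets)
               for (x1, y1, x2, y2), cls, conf in dets]
    kept = []
    for box, cls, conf in sorted(global_dets + shifted, key=lambda d: -d[2]):
        if all(c != cls or iou(box, kb) < iou_thr for kb, c, _ in kept):
            kept.append((box, cls, conf))
    return kept
```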
As shown in fig. 2 to 5, the recognition results of the technical method of the present invention on tiny-pest pictures show that the method can handle both the detection of tiny pests in dense regions and the detection of sparse pests; compared with the prior art, it misses fewer detections and achieves higher precision in dense regions.
TABLE 1 comparison table of the detection results on the micro-pest data set
Method AP AP50 AP75
FCOS 22.0 61.9 8.7
RetinaNet 17.5 51.3 6.5
FasterRCNN 23.6 63.2 10.8
DMNet 24.5 64.6 12.0
The method of the invention 30.5 71.8 16.3
As shown in Table 1, evaluated with the detection precision metrics AP, AP50 and AP75 that are well known in the industry, the detection precision of the method of the present invention on the micro-pest data set is superior to that of the other prior-art methods.
The foregoing shows and describes the general principles, essential features, and advantages of the invention. It will be understood by those skilled in the art that the present invention is not limited to the embodiments described above, which are merely illustrative of the principles of the invention, but that various changes and modifications may be made without departing from the spirit and scope of the invention, which fall within the scope of the invention as claimed. The scope of the invention is defined by the appended claims and equivalents thereof.

Claims (7)

1. A tiny pest image recognition method based on local dense region density feature detection is characterized by comprising the following steps:
11) acquisition of training images: acquiring a pest image data set with an artificial mark;
12) constructing a pest dense area detection network: constructing a pest dense area detection network, wherein the pest dense area detection network comprises an overall feature extraction network and a dense area suggestion network;
13) training of a pest dense area detection network: training a pest dense area detection network by using a pest image data set;
14) standardization of pest dense areas: performing standardized operation on a local pest region output by the pest dense region detection network; the standardized operation of the pest dense region comprises pest dense region merging operation and pest dense region segmentation operation, pest local regions output by a pest dense region detection network are input, and standardized image local regions which are grouped according to density scores and are similar in size are output;
15) constructing and training a local area pest target detection network group: constructing and training a local area pest target detection network group; inputting image local areas which are obtained by standardizing pest dense areas and grouped according to density scores, and outputting pest identification and positioning results in the image local areas which are grouped according to the density scores;
16) constructing and training a global pest target detection network;
17) and (3) fusing pest detection results: fusing the pest identification and positioning results output by the global pest target detection network and the local region pest target detection network group to obtain a global pest identification and positioning result;
18) obtaining an image of the pest to be detected: acquiring a tiny pest image to be detected;
19) and obtaining a pest image detection result.
2. The method for identifying the tiny pests based on the local dense region density feature detection as claimed in claim 1, wherein the construction of the pest dense region detection network comprises the following steps:
21) setting an overall feature extraction network: the overall feature extraction network comprises a backbone network and a feature fusion network; the backbone network consists of a plurality of layers of convolutional neural network layers, a pooling layer and an activation function layer which are superposed and is used for extracting basic features in the picture and outputting a plurality of layers of feature maps; the feature fusion network fuses the feature maps of all layers by laterally connecting the multi-layer feature maps output by the backbone network, and outputs an overall feature map considering different levels of semantic features; wherein the backbone network is a ResNet50 network, and the feature fusion network is an FPN feature pyramid network;
22) setting a dense area proposal network: setting the input of the dense area suggestion network as an overall characteristic graph output by an overall characteristic extraction network, and outputting the overall characteristic graph as a density score of a selected area which takes each anchor point as the center;
the dense area proposal network first uses a convolution layer with a 3 x 3 convolution kernel and 512 channels, then a linear rectification function ReLU as the convolution layer activation function, and then a convolution layer with a 1 x 1 convolution kernel whose number of channels, S x R, is determined by the number of region shapes S and the number of region magnification ratios R.
3. The method for identifying the tiny pests based on the local dense region density feature detection as claimed in claim 1, wherein the training of the pest dense region detection network comprises the following steps:
31) inputting a pest image data set with artificial labels into the overall feature extraction network, extracting a basic image feature map through the backbone network, and passing the basic feature map through the feature fusion network to output an overall feature map in which multiple layers of semantics are fused;
32) inputting the overall feature map into the dense area proposal network;
setting an anchor point A that slides on the feature map with a single sliding step k; centered on the anchor point there are S shapes of region selection boxes, each with R magnification ratios; when the anchor point slides to the i-th position, the number of artificial labels contained in the s-th selection box at the r-th magnification ratio is the target count n_i^{s,r}, and the area of the current region selection box is a_i^{s,r}; the target density score d_i^{s,r} in the current selection box is computed from the target count n_i^{s,r}, the selection-box area a_i^{s,r} and a deviation compensation coefficient O that ensures the target density score is a positive number, with O = 10 in application, a maximum target density score d_max = 4 and a minimum d_min = 1;
taking the target density score d_i^{s,r} of the current selection box as the real density score, and the score output by the network's convolution layer from the overall feature map as the predicted target density score, the loss function generated by the current image for back-propagation training of the dense area detection network sums, over the I anchor point positions in the image and all S x R selection boxes, the loss of each selection box, where each selection box's loss is computed with the smooth L1 norm SmoothL1 of the difference between the real and predicted density scores;
finally, the pest dense area detection network obtained through training outputs, for each image, a series of candidate regions {b_i} with corresponding predicted density scores {d_i}; the candidate regions with high density scores are the dense regions.
4. The method for identifying the tiny pests based on the local dense region density feature detection as claimed in claim 1, wherein the pest dense region standardization comprises the following steps:
41) merging candidate regions whose density scores are similar and whose areas highly overlap; the input is the dense region set and corresponding density scores output by the dense region detection network, and the output is the merged dense region set and corresponding density scores; the dense region merging steps are as follows:
411) dividing the density scores into 5 groups by value: a score of 1-2 is not dense, 2-2.5 is sparse, 2.5-3 is moderately dense, 3-3.5 is generally dense, and 3.5-4 is extremely dense;
inputting the agricultural tiny pest images with artificial labels into the trained pest dense area detection network to obtain the local pest regions {b_1, ..., b_n}, where each dense region is represented by its top-left and bottom-right coordinates (x1, y1, x2, y2);
412) setting an overlap calculation formula to compute the overlap OL(a, b) of two dense regions a and b, and setting a merging threshold N_t to judge whether the two dense regions a and b need to be merged: if OL(a, b) is greater than the threshold N_t, the dense regions a and b are merged;
413) setting a merge operation M, whose input is the set of regions to be merged and their corresponding density scores, and whose output {b', d'} is the newly merged region and its corresponding density score; the smallest top-left coordinates and the largest bottom-right coordinates among the input regions are taken as the top-left and bottom-right coordinates of the merged region b', and the density score d' corresponding to the merged region is the minimum of the input regions' density scores;
414) from the dense region set, taking out the region b_k whose corresponding density score d_k is the largest, and traversing all other regions b_i in the set, computing the overlap OL(b_k, b_i); if the overlap is greater than the merging threshold N_t and the corresponding density scores d_k and d_i belong to the same density group, moving b_i and d_i into the merge candidate set; if the candidate set is not empty after the traversal is finished, also putting the currently selected b_k and d_k into the candidate set, performing the merge operation M, and putting the output {b', d'} back into the region and score sets; otherwise, putting b_k and d_k into the output set; cycling the above operations until all regions in the set have been taken out;
415) the output set then contains the merged dense regions and their corresponding density scores;
42) performing a splitting operation on the oversized regions among the merged dense regions; the input is the merged dense region set and corresponding density scores output by the merging operation, and the output is the split dense region set and corresponding density scores; the splitting steps are as follows:
421) setting a splitting threshold L_t to judge whether the current dense region a needs to be split;
422) taking a region b_i out of the merged region set; if the region does not need splitting, putting it together with its corresponding density score into the output set; otherwise performing a bisection, retaining an overlapping region of L_t/4 at the bisection boundary, keeping the density score of each split part equal to the original score, and putting them into the output set; cycling the above operations until all regions in the set have been taken out;
423) the output set then contains the split dense regions and their corresponding density scores.
5. The method for identifying the tiny pests based on the local dense region density feature detection as claimed in claim 1, wherein the construction and training of the local region pest target detection network group comprises the following steps:
51) constructing a plurality of groups of parallel local area pest target detection networks to form a local area pest target detection network group; the input of each group of local area pest target detection network is a local area which is obtained by standardizing a pest dense area and is grouped in a corresponding density mode, and the output is an identification and positioning result of the pest target in the grouped local area; the construction of each group of local area pest target detection network comprises the following steps:
511) setting a pest feature extraction network; the pest feature extraction network is used for extracting an input pest feature map of a local region, inputting a dense region obtained by standardizing a pest region, and outputting the pest feature map based on the dense region;
512) setting a pest identification and positioning network; the pest identification and positioning network is used for automatically learning pest characteristic diagrams and identifying and positioning pests, inputting the obtained pest characteristic diagrams of the dense area, and outputting pest type identification and positioning results of the dense area;
52) training a plurality of groups of parallel local area pest target detection networks by using the obtained standardized pest dense areas; the training comprises the following steps:
521) according to the density scores of the input regions, setting up three groups covering the sparse, moderately dense, generally dense and extremely dense scores, and grouping the normalized dense regions by their corresponding density scores;
522) inputting each grouped local dense region into the local area pest target detection network of the corresponding group for training, to obtain the trained local area pest target detection networks.
6. The method for identifying the tiny pests based on the local dense region density feature detection as claimed in claim 1, wherein the construction and training of the global pest target detection network comprises the following steps:
61) constructing a global pest target detection network comprising an overall feature extraction network and a pest target identification and positioning network;
62) setting an overall characteristic extraction network for extracting a characteristic graph in the whole input picture, wherein the input picture is a tiny pest picture, and the output picture is an overall characteristic graph obtained based on the whole pest picture;
63) setting a pest target identification and positioning network for automatically learning an integral characteristic diagram and detecting pest targets, inputting the integral characteristic diagram, and outputting a pest identification result and a positioning result;
64) and training the global pest target detection network.
7. The method for identifying the tiny pests based on the local dense region density feature detection as claimed in claim 1, wherein the obtaining of the pest image detection result comprises the following steps:
71) inputting an agricultural tiny pest image to be detected into a trained pest dense area detection network to obtain a pest local area;
72) standardizing pest dense areas in pest local areas to generate standardized pest dense areas;
73) dividing pest dense areas into groups according to corresponding density scores, and respectively inputting corresponding target detection networks in the trained local area pest target detection network groups to generate local area pest target detection results of the corresponding density groups;
74) inputting an agricultural tiny pest image to be detected into the trained global pest target detection network to obtain a global pest target detection result;
75) and (3) fusion of pest target detection results: and fusing the global pest target detection result and the local region pest target detection result to obtain a final target detection result.
CN202110440782.0A 2021-04-23 2021-04-23 Tiny pest image identification method based on local dense area density feature detection Active CN113159183B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110440782.0A CN113159183B (en) 2021-04-23 2021-04-23 Tiny pest image identification method based on local dense area density feature detection


Publications (2)

Publication Number Publication Date
CN113159183A true CN113159183A (en) 2021-07-23
CN113159183B CN113159183B (en) 2022-08-30

Family

ID=76870091

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110440782.0A Active CN113159183B (en) 2021-04-23 2021-04-23 Tiny pest image identification method based on local dense area density feature detection

Country Status (1)

Country Link
CN (1) CN113159183B (en)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111178121A (en) * 2018-12-25 2020-05-19 中国科学院合肥物质科学研究院 Pest image positioning and identifying method based on spatial feature and depth feature enhancement technology
WO2020102988A1 (en) * 2018-11-20 2020-05-28 西安电子科技大学 Feature fusion and dense connection based infrared plane target detection method
JP2020091543A (en) * 2018-12-03 2020-06-11 キヤノン株式会社 Learning device, processing device, neural network, learning method, and program
CN111460315A (en) * 2020-03-10 2020-07-28 平安科技(深圳)有限公司 Social portrait construction method, device and equipment and storage medium
CN111476238A (en) * 2020-04-29 2020-07-31 中国科学院合肥物质科学研究院 Pest image detection method based on regional scale perception technology
CN111476317A (en) * 2020-04-29 2020-07-31 中国科学院合肥物质科学研究院 Plant protection image non-dense pest detection method based on reinforcement learning technology
US20200250461A1 (en) * 2018-01-30 2020-08-06 Huawei Technologies Co., Ltd. Target detection method, apparatus, and system
CN112488244A (en) * 2020-12-22 2021-03-12 中国科学院合肥物质科学研究院 Method for automatically counting densely distributed small target pests in point labeling mode by utilizing thermodynamic diagram

Non-Patent Citations (6)

* Cited by examiner, † Cited by third party
Title
C. LI et al.: "Density map guided object detection in aerial images", Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops (2020) *
LI R et al.: "A coarse-to-fine network for aphid recognition and detection in the field", Biosystems Engineering *
LIU, LIU et al.: "Deep Learning based Automatic Approach using Hybrid Global and Local Activated Features towards Large-scale Multi-class Pest Monitoring", 2019 IEEE 17th International Conference on Industrial Informatics (INDIN) *
LIU Liu: "Research and application of deep-learning-based crop pest detection methods", CNKI Doctoral Dissertations Electronic Journal (in Chinese) *
LI Guorui et al.: "Fine-grained bird recognition based on cross-layer feature fusion of semantic information", Computer Applications and Software (in Chinese) *
LIANG Yanyu et al.: "Small object detection algorithm with a multi-scale non-local attention network", Journal of Frontiers of Computer Science and Technology (in Chinese) *

Also Published As

Publication number Publication date
CN113159183B (en) 2022-08-30

Similar Documents

Publication Publication Date Title
CN111027547B (en) Automatic detection method for multi-scale polymorphic target in two-dimensional image
CN110599448B (en) Migratory learning lung lesion tissue detection system based on MaskScoring R-CNN network
CN107977671B (en) Tongue picture classification method based on multitask convolutional neural network
CN108830326B (en) Automatic segmentation method and device for MRI (magnetic resonance imaging) image
CN106056118B (en) A kind of identification method of counting for cell
CN108288271A (en) Image detecting system and method based on three-dimensional residual error network
CN107016409A (en) A kind of image classification method and system based on salient region of image
CN112036231B (en) Vehicle-mounted video-based lane line and pavement indication mark detection and identification method
CN108629286B (en) Remote sensing airport target detection method based on subjective perception significance model
CN108305260B (en) Method, device and equipment for detecting angular points in image
CN112084869A (en) Compact quadrilateral representation-based building target detection method
CN113569724B (en) Road extraction method and system based on attention mechanism and dilation convolution
CN104657980A (en) Improved multi-channel image partitioning algorithm based on Meanshift
CN111126393A (en) Vehicle appearance refitting judgment method and device, computer equipment and storage medium
CN112287906B (en) Template matching tracking method and system based on depth feature fusion
CN110929746A (en) Electronic file title positioning, extracting and classifying method based on deep neural network
CN115546605A (en) Training method and device based on image labeling and segmentation model
CN110738132A (en) target detection quality blind evaluation method with discriminant perception capability
JP2016099835A (en) Image processor, image processing method, and program
CN112949634B (en) Railway contact net nest detection method
CN113361530A (en) Image semantic accurate segmentation and optimization method using interaction means
CN110688512A (en) Pedestrian image search algorithm based on PTGAN region gap and depth neural network
Lee et al. Enhancement for automatic extraction of RoIs for bone age assessment based on deep neural networks
CN113159183B (en) Tiny pest image identification method based on local dense area density feature detection
CN117292217A (en) Skin typing data augmentation method and system based on countermeasure generation network

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant