CN116091394A - Deep learning-based insect type and number image recognition algorithm - Google Patents

Deep learning-based insect type and number image recognition algorithm

Info

Publication number
CN116091394A
Authority
CN
China
Prior art keywords
image
pest
target
deep learning
recognition algorithm
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211157102.5A
Other languages
Chinese (zh)
Inventor
王志伟
吴京业
周军
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nanjing Byte Data Technology Co ltd
Original Assignee
Nanjing Byte Data Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nanjing Byte Data Technology Co ltd filed Critical Nanjing Byte Data Technology Co ltd
Priority to CN202211157102.5A priority Critical patent/CN116091394A/en
Publication of CN116091394A publication Critical patent/CN116091394A/en
Legal status: Pending

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00: Image analysis
    • G06T 7/0002: Inspection of images, e.g. flaw detection
    • G06T 7/0012: Biomedical image inspection
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 10/00: Arrangements for image or video recognition or understanding
    • G06V 10/20: Image preprocessing
    • G06V 10/24: Aligning, centring, orientation detection or correction of the image
    • G06V 10/26: Segmentation of patterns in the image field; cutting or merging of image elements to establish the pattern region, e.g. clustering-based techniques; detection of occlusion
    • G06V 10/267: Segmentation by performing operations on regions, e.g. growing, shrinking or watersheds
    • G06V 10/28: Quantising the image, e.g. histogram thresholding for discrimination between background and foreground patterns
    • G06V 10/30: Noise filtering
    • G06V 10/40: Extraction of image or video features
    • G06V 10/56: Extraction of image or video features relating to colour
    • G06V 10/70: Recognition or understanding using pattern recognition or machine learning
    • G06V 10/764: Using classification, e.g. of video objects
    • G06V 10/765: Using rules for classification or partitioning the feature space
    • G06V 10/77: Processing image or video features in feature spaces; data integration or data reduction, e.g. principal component analysis [PCA], independent component analysis [ICA] or self-organising maps [SOM]; blind source separation
    • G06V 10/774: Generating sets of training patterns; bootstrap methods, e.g. bagging or boosting
    • G06V 10/82: Using neural networks

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Evolutionary Computation (AREA)
  • Health & Medical Sciences (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Computing Systems (AREA)
  • Artificial Intelligence (AREA)
  • Databases & Information Systems (AREA)
  • Software Systems (AREA)
  • Quality & Reliability (AREA)
  • Radiology & Medical Imaging (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Image Analysis (AREA)

Abstract

The invention discloses a deep learning-based insect type and number image recognition algorithm, which comprises the following steps: collecting pest images and establishing a first image training data set and a second image training data set; acquiring a plurality of pest targets in an image by a multi-target positioning method; for each first image containing a pest target, segmenting the first image by a segmentation method to obtain a second binary image; acquiring characteristic information of the pest target from the second binary image; normalizing the characteristic information so as to convert the pest target into a representation in feature space; and classifying the normalized characteristic information with a classifier to obtain the corresponding pest categories and counting the pests of each category, thereby realizing pest identification and quantity statistics. Through this image recognition algorithm, the invention improves the convenience and preventive capability of pest control work, so as to remedy the shortcomings of current industry practice.

Description

Deep learning-based insect type and number image recognition algorithm
Technical Field
The invention belongs to the technical field of tobacco pest control, and particularly relates to a deep learning-based insect species and quantity image recognition algorithm.
Background
The prevention and treatment of insect pests are widely applied to industries such as traditional Chinese medicine, grain storage, tobacco industry, food processing, forest insect pest prevention, crop planting and the like.
In recent years, deep learning image recognition technology has developed rapidly in China, and many companies and researchers now approach pest control by designing detection algorithm models. However, crop pests are typically very small and occupy a tiny proportion of the image, so pest targets are difficult to distinguish from the surrounding environment; existing detection methods easily lose pest target information during sampling, and because pests appear at widely varying scales, the recognition accuracy of existing methods is low on large-scale pest image recognition and detection tasks.
Disclosure of Invention
The invention aims to provide a deep learning-based insect species and quantity image recognition algorithm, so as to solve the problems of the prior art described in the background section.
In order to achieve the above purpose, the invention adopts the following technical scheme:
an insect type and number image recognition algorithm based on deep learning comprises the following steps:
S1, acquiring pest images, and establishing a first image training data set (namely, a feature recognition training set of single pests) and a second image training data set (namely, a positioning training set of multiple target pests);
S2, acquiring a plurality of pest targets in the image (and thereby their number) by a multi-target positioning method;
S3, for each first image containing a pest target, segmenting the first image by a segmentation method to obtain a second binary image, wherein the segmentation formula is:
bw(x, y) = 1, if g(x, y) > T; bw(x, y) = 0, otherwise
S4, acquiring characteristic information of the pest target from the second binary image, wherein the characteristic information comprises color features and Hu invariant moment features, and the color features are obtained as follows: separating the three RGB components of the second binary image to obtain the histogram of each component, and calculating the statistical descriptions of each histogram with the following formulas:
m = Σ_{i=0}^{L-1} z_i p(z_i)
σ = [ Σ_{i=0}^{L-1} (z_i - m)² p(z_i) ]^{1/2}
R = 1 - 1/(1 + σ²)
μ₃ = Σ_{i=0}^{L-1} (z_i - m)³ p(z_i)
U = Σ_{i=0}^{L-1} p²(z_i)
e = -Σ_{i=0}^{L-1} p(z_i) log₂ p(z_i)
S5, normalizing the characteristic information so as to convert the pest target into a representation in feature space, wherein the normalization method (min-max normalization, also called dispersion normalization, which maps the original data from its value range into a fixed range through a linear transformation) is:
x′ = (x - Q2A) / (Q3A - Q1A)
S6, classifying the normalized characteristic information with a classifier to obtain the corresponding pest categories, and counting the pests of the same category, thereby realizing pest identification and quantity statistics.
Preferably, step S1 includes preprocessing the images before the first and second image training data sets are established: the preprocessing of the first image training set is to enlarge and crop the target pests, and the preprocessing of the second image training set is to remove noise, specifically by means of a filter.
Preferably, the multi-target positioning in S2 mainly uses a watershed marking method:
1) Binarizing the original image by the Otsu method;
2) Determining the minimum circumscribed circle of the largest target area in the first binary image, and marking the pixels outside the circle as definite black background;
3) Performing ellipse fitting on the largest target area in the first binary image, and drawing the inscribed circle whose diameter is the major axis of the ellipse and whose center is the ellipse center;
4) Performing an erosion operation on the first binary image and intersecting the result with the inscribed circle to mark the background area; performing a dilation operation on the first binary image and determining the pest area through the inscribed circle;
5) After the background and foreground marks are determined, segmenting the original image directly with the watershed algorithm to obtain a plurality of first images containing pest targets.
Preferably, in step S3, bw is the second binary image and g(x, y) is the central pixel of each window; for each pixel, a threshold T is computed from the gray values of all pixels in a fixed-size window, and the central pixel of the window is then binarized against this threshold to obtain the second binary image.
Preferably, in S4, p(z_i) is the histogram in the formulas, m is the mean, σ is the standard deviation, R is the smoothness, μ₃ is the normalized third moment of the histogram, U is the uniformity, e is the entropy, and L is the number of gray levels of the histogram; the mean describes the average color of the whole target area, the standard deviation describes the non-uniformity of the color distribution, the third moment describes the skewness of the histogram, and the entropy describes the randomness of the histogram.
Preferably, in the formula of S5, Q1A represents the first quartile of the pest area, Q2A the second quartile, and Q3A the third quartile.
The invention has the following technical effects and advantages over the prior art:
1. The invention provides a deep-learning image recognition algorithm for the number and types of insects in a trap, so that the types and numbers of pests no longer need to be judged manually, reducing the workload of pest management personnel; in addition, by acquiring the pest types and numbers through the first image and the second image, pest target information is not easily lost during sampling, and the recognition accuracy is greatly improved on large-scale pest image recognition and detection tasks;
2. The invention treats the counterpart's identity as the counterpart's public key, so no certification by a third-party authority is required. Compared with the IBE algorithm, only a small number of common parameters need to be kept, rather than a large number of per-user parameters, so the method needs no support from databases such as a directory database (LDAP) and requires no online maintenance of the system.
Drawings
FIG. 1 is a flow chart of a method of generating an asymmetric key pair based on a user identification in accordance with the present invention.
Detailed Description
The following description of the embodiments of the present invention will be made clearly and completely with reference to the accompanying drawings, in which it is apparent that the embodiments described are only some embodiments of the present invention, but not all embodiments. The specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention.
The invention provides a deep learning-based insect type and number image recognition algorithm, which comprises the following steps:
S1, acquiring pest images, and establishing a first image training data set (namely, a feature recognition training set of single pests) and a second image training data set (namely, a positioning training set of multiple target pests);
S2, acquiring a plurality of pest targets in the image (and thereby their number) by a multi-target positioning method;
S3, for each first image containing a pest target, segmenting the first image by a segmentation method to obtain a second binary image, wherein the segmentation formula is:
bw(x, y) = 1, if g(x, y) > T; bw(x, y) = 0, otherwise
S4, acquiring characteristic information of the pest target from the second binary image, wherein the characteristic information comprises color features and Hu invariant moment features, and the color features are obtained as follows: separating the three RGB components of the second binary image to obtain the histogram of each component, and calculating the statistical descriptions of each histogram with the following formulas:
m = Σ_{i=0}^{L-1} z_i p(z_i)
σ = [ Σ_{i=0}^{L-1} (z_i - m)² p(z_i) ]^{1/2}
R = 1 - 1/(1 + σ²)
μ₃ = Σ_{i=0}^{L-1} (z_i - m)³ p(z_i)
U = Σ_{i=0}^{L-1} p²(z_i)
e = -Σ_{i=0}^{L-1} p(z_i) log₂ p(z_i)
S5, normalizing the characteristic information so as to convert the pest target into a representation in feature space, wherein the normalization method (min-max normalization, also called dispersion normalization, which maps the original data from its value range into a fixed range through a linear transformation) is:
x′ = (x - Q2A) / (Q3A - Q1A)
S6, classifying the normalized characteristic information with a classifier to obtain the corresponding pest categories, and counting the pests of the same category, thereby realizing pest identification and quantity statistics.
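The patent does not name the classifier used in S6. As one plausible realization, the sketch below trains a support vector machine on the normalized feature vectors and counts predictions per category; scikit-learn is used for illustration, and the file names and arrays (train_features, train_labels, test_features) are hypothetical placeholders.

```python
from collections import Counter
import numpy as np
from sklearn.svm import SVC

# Assumed inputs: normalized S5 feature vectors and their pest-category labels.
train_features = np.load("train_features.npy")   # hypothetical file names
train_labels = np.load("train_labels.npy")
test_features = np.load("test_features.npy")

clf = SVC(kernel="rbf")                 # classifier choice is an assumption
clf.fit(train_features, train_labels)
predicted = clf.predict(test_features)  # pest category for each detected target
counts = Counter(predicted)             # quantity statistics per category (S6)
```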
The step S1 includes preprocessing the images before the first and second image training data sets are established: the preprocessing of the first image training set is to enlarge and crop the target pests, and the preprocessing of the second image training set is to remove noise, specifically by means of a filter.
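As a concrete illustration of this denoising step: the patent only states that noise is removed "through a filter", so the sketch below assumes a 5x5 median filter, a common choice for trap images, using OpenCV.

```python
# Hedged sketch of the S1 denoising preprocessing; the filter type and
# kernel size are assumptions, since the patent does not specify them.
import cv2

def preprocess_for_positioning(path: str):
    img = cv2.imread(path)            # load the trap image (BGR)
    return cv2.medianBlur(img, 5)     # 5x5 median filter removes salt-and-pepper noise
```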
The multi-target positioning in S2 mainly uses a watershed marking method (a code sketch follows the steps below):
1) Binarizing the original image by the Otsu method;
2) Determining the minimum circumscribed circle of the largest target area in the first binary image, and marking the pixels outside the circle as definite black background;
3) Performing ellipse fitting on the largest target area in the first binary image, and drawing the inscribed circle whose diameter is the major axis of the ellipse and whose center is the ellipse center;
4) Performing an erosion operation on the first binary image and intersecting the result with the inscribed circle to mark the background area; performing a dilation operation on the first binary image and determining the pest area through the inscribed circle;
5) After the background and foreground marks are determined, segmenting the original image directly with the watershed algorithm to obtain a plurality of first images containing pest targets.
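A minimal sketch of this marker-based watershed pipeline follows. It keeps step 1) (Otsu binarization) and step 5) (watershed on marked foreground and background), but for brevity derives the markers with plain erosion and dilation rather than the circumscribed/inscribed-circle construction of steps 2) to 4), which is specific to this patent and not reproduced here.

```python
import cv2
import numpy as np

def watershed_segment(img_bgr):
    gray = cv2.cvtColor(img_bgr, cv2.COLOR_BGR2GRAY)
    # Step 1): Otsu binarization gives the first binary image.
    _, bw = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    kernel = np.ones((3, 3), np.uint8)
    # Dilated mask: everything outside it is certain background.
    sure_bg = cv2.dilate(bw, kernel, iterations=3)
    # Eroded mask: certain foreground (pest) regions.
    sure_fg = cv2.erode(bw, kernel, iterations=3)
    unknown = cv2.subtract(sure_bg, sure_fg)     # still-ambiguous band
    _, markers = cv2.connectedComponents(sure_fg)
    markers = markers + 1                        # background label becomes 1
    markers[unknown == 255] = 0                  # 0 = region to be resolved
    # Step 5): watershed labels each pest region; boundaries are marked -1.
    return cv2.watershed(img_bgr, markers)
```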
In S3, bw is the second binary image and g(x, y) is the central pixel of each window; for each pixel, a threshold T is computed from the gray values of all pixels in a fixed-size window, and the central pixel of the window is then binarized against this threshold to obtain the second binary image.
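The windowed thresholding described here corresponds to standard local (adaptive) thresholding. The sketch below assumes the per-window threshold T is the window mean; the patent does not state how T is derived from the window's gray values.

```python
import cv2

def second_binary_image(gray, window: int = 15, c: int = 0):
    # For each pixel g(x, y), T is computed from the surrounding window
    # (here: the window mean minus c, an assumption) and the central
    # pixel is binarized against T, yielding bw.
    return cv2.adaptiveThreshold(gray, 255, cv2.ADAPTIVE_THRESH_MEAN_C,
                                 cv2.THRESH_BINARY, window, c)
```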
In S4, p(z_i) is the histogram in the formulas, m is the mean, σ is the standard deviation, R is the smoothness, μ₃ is the normalized third moment of the histogram, U is the uniformity, e is the entropy, and L is the number of gray levels of the histogram; the mean describes the average color of the whole target area, the standard deviation describes the non-uniformity of the color distribution, the third moment describes the skewness of the histogram, and the entropy describes the randomness of the histogram.
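These six descriptors match the standard statistical histogram measures. A sketch computing them for one color channel follows; the textbook forms of the formulas are assumed, as the patent's own formula images did not survive.

```python
import numpy as np

def histogram_statistics(channel, L: int = 256):
    """Compute the six S4 histogram descriptors for a single RGB channel."""
    hist, _ = np.histogram(channel, bins=L, range=(0, L))
    p = hist / hist.sum()                        # p(z_i)
    z = np.arange(L)
    m = np.sum(z * p)                            # mean
    sigma = np.sqrt(np.sum((z - m) ** 2 * p))    # standard deviation
    R = 1 - 1 / (1 + sigma ** 2)                 # smoothness
    mu3 = np.sum((z - m) ** 3 * p)               # third moment (skewness)
    U = np.sum(p ** 2)                           # uniformity / consistency
    e = -np.sum(p[p > 0] * np.log2(p[p > 0]))    # entropy
    return m, sigma, R, mu3, U, e
```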
In the formula of S5, Q1A represents the first quartile of the pest area, Q2A the second quartile, and Q3A the third quartile.
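A sketch of the S5 normalization follows. The text describes a min-max style linear mapping but parameterizes the formula with the quartiles Q1A, Q2A and Q3A, so a quartile-based (robust) linear scaler is assumed here.

```python
import numpy as np

def normalize(feature_values):
    # Q1A, Q2A, Q3A: first, second and third quartiles of the feature over
    # the pest areas; the robust scaling form is inferred from the S5 symbols.
    q1, q2, q3 = np.percentile(feature_values, [25, 50, 75])
    return (np.asarray(feature_values) - q2) / (q3 - q1)
```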
The deep learning-based insect type and quantity image recognition algorithm further comprises an identification-based key generation and management method: public key calculation parameters and private key calculation parameters are established from a limited set of public and private keys, and a mapping calculation method and operation rules are created, so that each relying party can compute the counterpart's public key directly from the counterpart's identification, realizing an identification-based key method;
the method for generating an asymmetric key pair from a user identity comprises the steps of:
s1, generating a private key calculation parameter and a public key calculation parameter which correspond to each other;
s2, calculating a private key of the first user by using the private key calculation parameter according to the identification provided by the first user, and providing the generated private key for the first user;
s3, publishing the public key calculation parameters so that the second user can calculate the public key of the first user by utilizing the public key calculation parameters according to the identification of the first user after obtaining the identification of the first user;
S4, when calculating the public key and the private key, using scope parameters with a hierarchical structure in addition to the public key calculation parameters and the private key calculation parameters.
In S2 and S3, when calculating the public/private key, the identification is transformed to locate one or more elements in the public/private key calculation parameter matrix, and these elements are combined to obtain the public/private key.
The first user and the second user in S2 and S3 may be the same user or different users. The definition of identification in S2 and S3 is broad: a user name, an identity card number, a telephone number, a mail address, a personal account number, a device serial number, a software process name, and the like can all serve as identifications.
The private key calculation parameters in S1, S2, S3 and S4 are secret variables, used exclusively for private key production and stored in the key management center (KMC); the public key calculation parameters are public variables, published directly in whatever media are most accessible. Because the number of parameters to publish is very limited, they are generally recorded directly in the personal ID authentication card and issued to users together with the private key; thus, each relying party can compute any user's public key from the public key calculation parameters as long as it knows the counterpart's identification.
According to the definition of the elliptic curve cryptography (ECC) standard, take an elliptic curve E: y² = x³ + ax + b (mod m) with parameters T: (a, b, G, n, m), where m is the modulus, n is the order, and G is the base point, i.e. G = (x0, y0). If the private key s is chosen as any integer, the corresponding public key P is the point sG on the elliptic curve E, denoted (xs, ys).
Public and private key calculation bases are the basis for realizing an identification-based key algorithm. The private key calculation base SCB consists of arbitrarily selected integer variables s_ij, while the public key calculation base PCB is derived from the SCB according to the elliptic curve principle above, i.e. s_ij·G = (x_ij, y_ij) = P_ij, so that the two bases are in one-to-one correspondence. Defining the size of a calculation base as f×h, the SCB is the f×h matrix of the variables s_ij and the PCB is the f×h matrix of the corresponding points P_ij. The public and private key calculation bases have the following properties:
Property 1: there is a one-to-one correspondence between the private key calculation base SCB and the public key calculation base PCB. If s11 of the SCB is a private key, then P11 of the PCB is the public key of s11, because P11 = (x11, y11) = s11·G = s11·(x0, y0); similarly, if s21 is a private key, then P21 is the public key of s21. By analogy, if s_ij is a private key, then P_ij is its public key (i = 1..f, j = 1..h).
Property 2: in elliptic curve cryptography, if s11 and s21 are private keys with corresponding public keys P11 and P21, then when s11 + s21 = α is taken as a private key, P11 + P21 = β is the public key of α. This is because β = P11 + P21 = (x11, y11) + (x21, y21) = s11·G + s21·G = (s11 + s21)·G = α·G, which exactly meets the definition of public and private keys in elliptic curve cryptography.
Property 1 shows that establishing the private and public key calculation bases is very simple; property 2 provides the basis for realizing a large key space algorithmically.
If this property of the computation base can be organically linked to the user identity, it is not difficult to construct an identity-based key system.
A discrete logarithm cipher can be used instead: define parameters T = (g, m), where g is an integer base less than m and m is the modulus. If the integer s is a private key, then the public key is P = g^s mod m, where P is an integer. The private key calculation base SCB and the public key calculation base PCB are then constructed by analogy with the elliptic curve case above.
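A minimal numeric check of the additive key property (property 2) in this discrete logarithm setting, with toy parameters chosen purely for illustration:

```python
# With public keys P = g^s mod m, the public key of s1 + s2 is the product
# of the individual public keys, since g^(s1+s2) = g^s1 * g^s2 (mod m).
m, g = 467, 2            # toy prime modulus and base, not secure values
s1, s2 = 123, 321        # two private keys from a notional SCB
P1, P2 = pow(g, s1, m), pow(g, s2, m)
assert (P1 * P2) % m == pow(g, s1 + s2, m)
```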
Row mapping and column permutation: to achieve identification-based key distribution, a method must be found that binds the public/private key calculation bases to the user identification. The binding method is not unique; the simplest is random mapping.
For ease of explanation, the identification and the key variables are bound together in the following example through a simple (randomizing) encryption. Two algorithms are required for this: a row value calculation method and a column value calculation method.
Row value calculation method: a row value key ROWKEY is given; this is a public variable fixed as a constant.
First, under a HASH algorithm (e.g. MD5), the variable-length identification is transformed into a fixed-length variable DATA1: HASH(IDENTITY) = DATA1. Under an encryption algorithm (such as AES), DATA1 is taken as data and encrypted with the row value key ROWKEY to obtain the intermediate variable MAP0; MAP0 is in turn taken as data and encrypted with ROWKEY to obtain MAP1, and so on until the required number of MAP values is obtained. For convenience of explanation, the calculation base size is set to 32×32 in this example. The key ROWKEY used is provided in the ID authentication card.
The 16 bytes of MAP0 are each taken modulo m (m = 32 in this example), yielding 16 row values less than m, labeled map[0]–map[15]; the 16 bytes of MAP1 are each taken modulo m, yielding 16 more row values, labeled map[16]–map[31].
This gives 32 map values for the 32 row selections. If map[1] = 5, row 5 of the private key calculation base or public key calculation base is selected; if map[2] = 21, row 21 is selected, and so on.
Column value calculation method: to avoid accessing the column variables in sequential order, a permutation algorithm PMT over the column variables is provided; the result of the column permutation is one of the full permutations of (0, 1, 2, 3, ..., 31). The calculation method is as follows.
First, the key PMT_KEY used by the PMT algorithm is computed: AES_COLKEY(IDENTITY) = PMT_KEY, where COLKEY is given in the ID certificate.
Then, with PMT_KEY as the key, the PMT algorithm encrypts the original sequence to obtain the column permutation values PERMUT: PMT_PMT_KEY(original sequence) = PERMUT, where the original sequence is 0, 1, ..., 31 and PERMUT is the newly permuted sequence: σ(0, 1, ..., 31) = t0, t1, t2, ..., t31.
Assuming t0, t1, t2, ..., t31 = (3, 6, 12, ..., 5), the column variables are taken in the new order 3, 6, 12, ..., 5.
For example, if the row value and column value calculations above yield row values (7, 13, 29, ..., 11) and column permutation values (3, 6, 12, ..., 5), then the variables used from the private key calculation base are s[7,3], s[13,6], s[29,12], ..., s[11,5], and the variables used from the public key calculation base are P[7,3], P[13,6], P[29,12], ..., P[11,5]. Because the mapping values are identical when computing the public key and the private key, the selected positions coincide, which guarantees the pairing relation of the public and private keys.
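The sketch below illustrates the shape of this row-mapping and column-permutation selection under stated stand-ins: MD5 for the HASH step as in the text, but an HMAC keyed hash in place of the AES encryption steps (so ROWKEY and COLKEY act as keys of a keyed transform), and a keyed sort in place of the PMT permutation. Only the structure of the computation is shown, not the actual cipher choices of the method.

```python
import hashlib
import hmac

F = H = 32  # calculation base size from the example

def row_values(identity: bytes, rowkey: bytes):
    data1 = hashlib.md5(identity).digest()               # HASH(IDENTITY) = DATA1
    map0 = hmac.new(rowkey, data1, hashlib.md5).digest() # HMAC stands in for AES
    map1 = hmac.new(rowkey, map0, hashlib.md5).digest()
    return [b % F for b in map0 + map1]                  # 32 row indices < 32

def column_values(identity: bytes, colkey: bytes):
    pmt_key = hmac.new(colkey, identity, hashlib.md5).digest()
    # Keyed sort as a stand-in PMT: yields a full permutation of (0..31).
    return sorted(range(H),
                  key=lambda j: hashlib.md5(pmt_key + bytes([j])).digest())

rows = row_values(b"abcde@yahoo.com", b"ROWKEY")
cols = column_values(b"abcde@yahoo.com", b"COLKEY")
picks = list(zip(rows, cols))  # positions of s[row, col] / P[row, col] to combine
```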
The scope parameter is defined to resolve the contradiction between openness and closure in an authentication network, and is the key technology for realizing logical isolation. The authentication network is divided into n layers; for convenience of description, the scope parameters are divided into an A domain, a B domain and a C domain.
The A-domain parameter only works within that A range; if there is no interconnection requirement between different A domains and they are completely independent, independent calculation bases are used and the A-domain parameter need not be set.
The B-domain parameter only works within that B range; the parameters differ between different B domains.
The C-domain parameter only works within that C range; the parameters differ between different C domains. This satisfies the need for separation.
The A-domain public key parameter secures traffic within the A domain, the B-domain public key parameter secures traffic within the B domain, and the C-domain public key parameter secures traffic within the C domain. This meets the interworking requirements.
(4) Suppose user A's address on the Internet is abcde@yahoo.com and the calculation base size is 32×32. Assume the row values are map[i] = MAP0[i] mod 32 (i = 0, 1, ..., 15) and map[i] = MAP1[i] mod 32 (i = 16, 17, ..., 31), and the column values are PMT_PMT_KEY(original sequence) = t0, t1, t2, ..., t31. Then the formula used by the key management center when producing the private key for user A is:
SKA = ( Σ_{i=0}^{31} s[map[i], t[i]] + scope parameter ) mod n
and the formula used by each relying party when calculating the public key of user A is:
PKA = ( Σ_{i=0}^{31} P[map[i], t[i]] + scope parameter ) mod m
The public and private key calculation of elliptic curve cryptography is simulated with discrete logarithm cryptography in the same way: the private key is again the sum of the selected s[map[i], t[i]] plus the scope parameter mod n, and the public key is computed from the corresponding public key calculation base entries mod m. Thus a public key / private key correspondence is formed for the mail-address identification. Only the key management center stores the private key calculation base, so private keys can only be generated at the key management center; and since the public key calculation base is public, any relying party can compute the counterpart's public key as long as it knows the counterpart's mail address. Since the public key computation is automatic, it is as if the counterpart's user name (identification) were directly treated as the public key.
For key storage, each user keeps only their own private key and the public key calculation base, used for digital signature and key exchange. If the size of the public key calculation base is f×h, the storage amount is f×h entries while the number of derivable public keys is f^h. For example, when the matrix size is 16×64 = 1K, the storage amount is 1K, but the public key amount is 16^64 = 2^256 ≈ 10^77.
Because the number of shared parameters to store is very limited and they are public variables, they can be stored in whatever media or places are most convenient, for example written directly into each individual's ID certificate or published for sharing on various websites.
The ID certificate comprises a certificate body and a variable body:
The certificate body according to the present invention is almost the same as that of an ordinary certificate; it mainly defines basic user attributes such as name, position, level, validity period, issuing unit and signature, and is therefore not discussed further. The certificate body satisfies a classified, distributed obligation security policy.
The variable body is the core of the authentication card; it specifically configures the relevant keys and parameter variables, and comprises n different identifications and n scopes. The variable body includes 16 items of content, among them the following. The variable body satisfies the application-type voluntary security policy of divided roles.
Public key variable item
Certificate issuing item
The above are the main components of the variable body, but a public key calculation base and spare keys may also be added to the variable body.
Public key calculation base item
Spare key item
The certificate thus takes three forms. First form: ID certificate = certificate body + variable body. Second form: ID certificate = certificate body + variable body + public key calculation base. Third form: ID certificate = certificate body + variable body + spare key.
among other things, implementation system examples can construct a trusted authentication system based on the present invention, including office authentication systems, phone and mail authentication systems, ticket authentication systems, proxy (process) authentication systems, and so forth. The system is roughly divided into three major parts: background program, client program, standardized part.
The background program is a program of a key center, which is the highest authority for management. Under the corresponding security policy, the offline production task of the private key is mainly undertaken. Private key production requires configuration of a private key calculation base, production of a corresponding private key according to user identification (telephone number, mail address, personal account number, etc.) provided by a user, recording in a medium under protection of a user password, and issuing the medium to the user for use in an ID certificate mode, wherein the medium is, for example, a smart IC card.
The key parts of the client program are stored in the intelligent IC card, and the intelligent IC card comprises an intelligent IC card operating system with signature, authentication and other functions, a public key calculation program and an ID certificate. Thus, the public key calculation base and the public key calculation program as the common parameters are recorded at the same time in the smart IC card. The key center produces the intelligent IC card with signature, authentication and other functions, public key calculation program and ID certificate of different content.
Since key management is a rather complex system engineering, the system adaptability of the program and the flexibility of the certificates are crucial. The authentication network is a single-layer authentication network, a multi-layer authentication network, a star-shaped authentication network, a grid-shaped authentication network and the like, and is suitable for various authentication networks, the ID certificate formats are the same, but the content of the certificates can be different.
The proxy (process) authentication technology can only be realized by using full software, and the protection of private keys is mainly solved.
The techniques of the present invention may be implemented in software, hardware, or a combination of software and hardware. The methods of the present invention may be embodied in program instructions that when executed by one or more processors perform the methods described herein to achieve the objects of the invention.
While elliptic curve cryptography and discrete logarithmic cryptography are exemplified in the foregoing embodiments, and some specific key generation processes are incorporated to generate a public key from an identification and a small number of public parameters, those skilled in the art will recognize, based on the present disclosure, that other cryptographic mechanisms now available and possibly developed in the future may be employed to generate a public key from an identification and a small number of public parameters, as the scope of the present invention is not limited to the specific cryptographic forms and generation mechanisms disclosed herein, but includes other possible cryptographic forms and generation mechanisms.
Finally, it should be noted that the foregoing describes only preferred embodiments of the present invention. Although the invention has been described in detail with reference to these embodiments, those skilled in the art may still modify the described embodiments or substitute equivalents for some of their elements, and any modifications, equivalents or improvements made without departing from the spirit and principles of the present invention are intended to fall within its scope.

Claims (6)

1. The insect type and number image recognition algorithm based on deep learning is characterized in that: the method comprises the following steps:
S1, acquiring pest images, and establishing a first image training data set (namely, a feature recognition training set of single pests) and a second image training data set (namely, a positioning training set of multiple target pests);
S2, acquiring a plurality of pest targets in the image (and thereby their number) by a multi-target positioning method;
S3, for each first image containing a pest target, segmenting the first image by a segmentation method to obtain a second binary image, wherein the segmentation formula is:
bw(x, y) = 1, if g(x, y) > T; bw(x, y) = 0, otherwise
S4, acquiring characteristic information of the pest target from the second binary image, wherein the characteristic information comprises color features and Hu invariant moment features, and the color features are obtained as follows: separating the three RGB components of the second binary image to obtain the histogram of each component, and calculating the statistical descriptions of each histogram with the following formulas:
m = Σ_{i=0}^{L-1} z_i p(z_i)
σ = [ Σ_{i=0}^{L-1} (z_i - m)² p(z_i) ]^{1/2}
R = 1 - 1/(1 + σ²)
μ₃ = Σ_{i=0}^{L-1} (z_i - m)³ p(z_i)
U = Σ_{i=0}^{L-1} p²(z_i)
e = -Σ_{i=0}^{L-1} p(z_i) log₂ p(z_i)
S5, normalizing the characteristic information so as to convert the pest target into a representation in feature space, wherein the normalization method (min-max normalization, also called dispersion normalization, which maps the original data from its value range into a fixed range through a linear transformation) is:
x′ = (x - Q2A) / (Q3A - Q1A)
S6, classifying the normalized characteristic information with a classifier to obtain the corresponding pest categories, and counting the pests of the same category, thereby realizing pest identification and quantity statistics.
2. The deep learning-based insect species and number image recognition algorithm of claim 1, wherein: step S1 includes preprocessing the images before the first and second image training data sets are established: the preprocessing of the first image training set is to enlarge and crop the target pests, and the preprocessing of the second image training set is to remove noise, specifically by means of a filter.
3. The deep learning-based insect species and number image recognition algorithm of claim 1, wherein: the multi-target positioning in S2 mainly uses a watershed marking method:
1) Binarizing the original image by the Otsu method;
2) Determining the minimum circumscribed circle of the largest target area in the first binary image, and marking the pixels outside the circle as definite black background;
3) Performing ellipse fitting on the largest target area in the first binary image, and drawing the inscribed circle whose diameter is the major axis of the ellipse and whose center is the ellipse center;
4) Performing an erosion operation on the first binary image and intersecting the result with the inscribed circle to mark the background area; performing a dilation operation on the first binary image and determining the pest area through the inscribed circle;
5) After the background and foreground marks are determined, segmenting the original image directly with the watershed algorithm to obtain a plurality of first images containing pest targets.
4. The deep learning-based insect species and number image recognition algorithm of claim 1, wherein: in step S3, bw is the second binary image and g(x, y) is the central pixel of each window; for each pixel, a threshold T is computed from the gray values of all pixels in a fixed-size window, and the central pixel of the window is then binarized against this threshold to obtain the second binary image.
5. The deep learning-based insect species and number image recognition algorithm of claim 1, wherein: in S4, p(z_i) is the histogram in the formulas, m is the mean, σ is the standard deviation, R is the smoothness, μ₃ is the normalized third moment of the histogram, U is the uniformity, e is the entropy, and L is the number of gray levels of the histogram; the mean describes the average color of the whole target area, the standard deviation describes the non-uniformity of the color distribution, the third moment describes the skewness of the histogram, and the entropy describes the randomness of the histogram.
6. The deep learning-based insect species and number image recognition algorithm of claim 1, wherein: in the formula of S5, Q1A represents the first quartile of the pest area, Q2A the second quartile, and Q3A the third quartile.
CN202211157102.5A 2022-09-21 2022-09-21 Deep learning-based insect type and number image recognition algorithm Pending CN116091394A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211157102.5A CN116091394A (en) 2022-09-21 2022-09-21 Deep learning-based insect type and number image recognition algorithm

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211157102.5A CN116091394A (en) 2022-09-21 2022-09-21 Deep learning-based insect type and number image recognition algorithm

Publications (1)

Publication Number Publication Date
CN116091394A true CN116091394A (en) 2023-05-09

Family

ID=86210886

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211157102.5A Pending CN116091394A (en) 2022-09-21 2022-09-21 Deep learning-based insect type and number image recognition algorithm

Country Status (1)

Country Link
CN (1) CN116091394A (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116682107A (en) * 2023-08-03 2023-09-01 山东国宏生物科技有限公司 Soybean visual detection method based on image processing
CN116682107B (en) * 2023-08-03 2023-10-10 山东国宏生物科技有限公司 Soybean visual detection method based on image processing
CN117975312A (en) * 2024-03-28 2024-05-03 安徽大学 Unmanned aerial vehicle shooting image processing system for identifying pine wood nematode disease
CN117975312B (en) * 2024-03-28 2024-06-07 安徽大学 Unmanned aerial vehicle shooting image processing system for identifying pine wood nematode disease

Similar Documents

Publication Publication Date Title
Qi et al. Cpds: Enabling compressed and private data sharing for industrial Internet of Things over blockchain
Panah et al. On the properties of non-media digital watermarking: a review of state of the art techniques
Joseph et al. Retracted article: a multimodal biometric authentication scheme based on feature fusion for improving security in cloud environment
CN116091394A (en) Deep learning-based insect type and number image recognition algorithm
CN104011781B (en) Information processing device and information processing method
CN111800252A (en) Information auditing method and device based on block chain and computer equipment
CN110784310B (en) Method and device for hiding ciphertext strategy based on attribute encryption
Zhao et al. Iris template protection based on local ranking
CN106850187A (en) A kind of privacy character information encrypted query method and system
CN114944963B (en) Government affair data opening method and system
CN113779355B (en) Network rumor tracing evidence obtaining method and system based on blockchain
CN107688993A (en) A kind of credit information distribution account book system and record dissemination method
CN111475866A (en) Block chain electronic evidence preservation method and system
Gutte et al. Steganography for two and three LSBs using extended substitution algorithm
Blesswin et al. Enhanced semantic visual secret sharing scheme for the secure image communication
Raikhlin et al. The elements of associative stegnanography theory
Wang et al. Practical blockchain-based steganographic communication via adversarial AI: A case study in bitcoin
Zhang et al. Efficient reversible data hiding in encrypted binary image with Huffman encoding and weight prediction
Eltaieb et al. Efficient implementation of cancelable face recognition based on elliptic curve cryptography
CN111756531A (en) Communication system and method of LoRa terminal based on CPK
CN113656826A (en) Anonymous identity management and verification method supporting dynamic change of user attributes
CN116756763A (en) Privacy protection method and system for interaction data of power terminal
CN112910923A (en) Intelligent financial big data processing system
CN111698284A (en) Block chain-based computer encryption system and method
CN109064375B (en) Zero watermark-based large data property identification method and system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination