CN114782421B - Poultry veterinarian auxiliary system based on egg laying abnormality detection - Google Patents
- Publication number: CN114782421B (application CN202210683005.3A)
- Authority: CN (China)
- Legal status: Active (an assumption by Google, not a legal conclusion)
Classifications
- G06T 7/0002 — Image analysis; inspection of images, e.g. flaw detection
- G06T 7/11 — Region-based segmentation
- G06T 7/13 — Edge detection
- G06T 7/136 — Segmentation/edge detection involving thresholding
- G06T 7/40 — Analysis of texture
- G06T 7/73 — Determining position or orientation of objects or cameras using feature-based methods
- G06T 7/90 — Determination of colour characteristics
- Y02A 40/70 — Adaptation technologies in livestock or poultry
Abstract
The invention relates to the technical field of data processing, and in particular to a poultry veterinary assistance system based on egg-laying abnormality detection. By acquiring a visible light image of an egg to be detected and performing the corresponding data processing on that image, the invention can accurately determine whether the egg is a soft-shell egg. The whole process requires no human participation, which effectively reduces the cost of egg-laying abnormality detection.
Description
Technical Field
The invention relates to the technical field of data processing, and in particular to a poultry veterinary assistance system based on egg-laying abnormality detection.
Background
During the laying period, poultry often show egg-laying abnormalities such as a rapid drop in laying rate, abnormal eggs (soft-shell eggs, thick-shell eggs, malformed eggs and the like), the absence of a laying peak, or a persistently low laying rate, all of which cause economic losses of varying degrees to poultry farmers. When such abnormalities occur, they therefore need to be found in time, so that factors such as the poultry's laying environment can be corrected promptly, reducing both the abnormalities themselves and the resulting economic loss.
Existing egg-laying abnormality detection is usually manual. However, because abnormalities are relatively rare and eggs may be laid at night, the cost of manual detection is greatly increased, so an unsupervised automatic detection method is urgently needed to reduce the cost of egg-laying abnormality detection.
Disclosure of Invention
The invention aims to provide a poultry veterinary assistance system based on egg-laying abnormality detection, in order to solve the problem that manually detecting egg-laying abnormalities is expensive in the prior art.
To solve the above technical problem, the invention provides a poultry veterinary assistance system based on egg-laying abnormality detection, comprising an image acquisition module, an egg-laying abnormality detection module and an egg-laying abnormality feedback reporting module. The image acquisition module acquires a visible light image of the egg to be detected and sends it to the egg-laying abnormality detection module. The egg-laying abnormality detection module receives the visible light image, performs data processing based on it, determines the abnormality detection result of the egg to be detected, and sends that result to the egg-laying abnormality feedback reporting module, which sends the abnormality detection result to the backend or a remote terminal;
the method for determining the abnormal detection result of the egg to be detected comprises the following steps:
performing data processing on the visible light image of the egg to be detected, thereby obtaining an egg area image;
performing data processing on the egg area image to obtain an egg area gray image, and determining the actual gradient direction and gradient value of each pixel point in the egg area gray image according to the gray value of each pixel point in the egg area gray image;
determining a reference pixel point in the gray image of the egg area, and determining an ideal gradient direction of each pixel point in the gray image of the egg area relative to the reference pixel point according to the position of each pixel point in the gray image of the egg area and the position of the reference pixel point;
determining a gradient direction threshold corresponding to each pixel point in the gray image of the egg area according to the gray value, the gradient value and the position of each pixel point in the gray image of the egg area and the position of the reference pixel point;
determining a texture feature operator of each pixel point in the gray image of the egg area according to the actual gradient direction of each pixel point in the gray image of the egg area, the ideal gradient direction of each pixel point relative to the reference pixel point and the gradient direction threshold value corresponding to each pixel point, thereby obtaining a texture feature image;
and determining whether the egg to be detected is a soft-shell egg or not according to the texture feature operator of each pixel point in the texture feature image.
Further, the calculation formula for determining the ideal gradient direction of each pixel point in the egg area gray image relative to the reference pixel point is:

$$\theta'_i = \arctan\left(\frac{y_0 - y_i}{x_0 - x_i}\right)$$

wherein $\theta'_i$ is the ideal gradient direction of the $i$-th pixel point in the egg area gray image relative to the reference pixel point, $y_i$ and $x_i$ are the ordinate and abscissa of the $i$-th pixel point, and $y_0$ and $x_0$ are the ordinate and abscissa of the reference pixel point in the egg area gray image.
Further, the determining a gradient direction threshold corresponding to each pixel point in the gray-scale image of the egg area includes:
determining the distance between each pixel point in the gray image of the egg area and the reference pixel point according to the position of each pixel point in the gray image of the egg area and the position of the reference pixel point, thereby obtaining a first gradient threshold value adjusting factor of each pixel point in the gray image of the egg area;
determining a second gradient threshold adjustment factor of each pixel point in the gray image of the egg area according to the gray value and the gradient value of each pixel point in the gray image of the egg area;
and calculating the gradient direction threshold corresponding to each pixel point in the gray level image of the egg area according to the first gradient threshold adjusting factor and the second gradient threshold adjusting factor of each pixel point in the gray level image of the egg area.
Further, the second gradient threshold adjustment factor of each pixel point in the egg area gray image is calculated from that pixel point's gray value and gradient value, wherein $\beta_i$ is the second gradient threshold adjustment factor of the $i$-th pixel point in the egg area gray image, $g_i$ is the gray value of the $i$-th pixel point, and $G_i$ is the gradient value of the $i$-th pixel point.
Further, the calculation formula for the gradient direction threshold corresponding to each pixel point in the egg area gray image is:

$$Th_i = d \cdot \alpha_i \cdot \beta_i$$

wherein $Th_i$ is the gradient direction threshold corresponding to the $i$-th pixel point in the egg area gray image, $d$ is the gradient direction fixed value, $\alpha_i$ is the first gradient threshold adjustment factor of the $i$-th pixel point, and $\beta_i$ is the second gradient threshold adjustment factor of the $i$-th pixel point.
Further, the determining a texture feature operator of each pixel point in the gray level image of the egg area includes:
determining the ternary codes of all pixel points in the gray level image of the egg area according to the actual gradient direction of all the pixel points in the gray level image of the egg area, the ideal gradient direction of all the pixel points relative to the reference pixel points and the gradient direction threshold value corresponding to all the pixel points;
and determining texture feature operators of all pixel points in the gray level image of the egg area according to the ternary codes of the eight neighborhood pixel points of all the pixel points in the gray level image of the egg area.
Further, the calculation formula for the three-value code of each pixel point in the egg area gray image is:

$$s_i=\begin{cases}1, & \theta_i-\theta'_i > Th_i\\ 0, & \left|\theta_i-\theta'_i\right| \le Th_i\\ -1, & \theta_i-\theta'_i < -Th_i\end{cases}$$

wherein $s_i$ is the three-value code of the $i$-th pixel point in the egg area gray image, $\theta_i$ is its actual gradient direction, $\theta'_i$ is its ideal gradient direction relative to the reference pixel point, and $Th_i$ is its corresponding gradient direction threshold.
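The ternary coding and its eight-neighbourhood aggregation can be sketched in NumPy. The local-ternary-pattern-style thresholding follows the cases above; the angle wrapping and the sum-of-neighbours aggregation in `texture_operator` are our assumptions, since the patent does not spell out either detail:

```python
import numpy as np

def ternary_code(actual, ideal, thresh):
    """Per-pixel +1 / 0 / -1 code from how the actual gradient direction
    deviates from the ideal one relative to a per-pixel threshold."""
    # wrap the angular difference to (-pi, pi] before comparing (our choice)
    diff = (actual - ideal + np.pi) % (2 * np.pi) - np.pi
    return np.where(diff > thresh, 1, np.where(diff < -thresh, -1, 0))

def texture_operator(codes):
    """Texture feature per pixel: sum of the ternary codes of its eight
    neighbours (one simple aggregation; the patent's weighting is unstated)."""
    p = np.pad(codes, 1, mode="constant")
    h, w = codes.shape
    out = np.zeros((h, w), int)
    for dy in range(3):
        for dx in range(3):
            if dy == 1 and dx == 1:
                continue  # skip the centre pixel itself
            out += p[dy:dy + h, dx:dx + w]
    return out
```

A pixel whose deviation wraps around the circle (e.g. 3.0 rad versus -3.0 rad) is correctly treated as a small deviation by the wrapping step.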
Further, the method further comprises:
and if the eggs to be detected are not soft-shell eggs, performing edge detection on the texture feature images, and if an edge area is detected, determining that the eggs to be detected are pockmarked eggs and the edge area is the pockmarked area.
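The pockmark check can be sketched as follows. The patent names only "edge detection" on the texture feature image; this naive 4-neighbour transition detector is a stand-in for whatever detector (e.g. Canny) the implementation actually uses:

```python
import numpy as np

def edge_pixels(feature_img):
    """Naive edge map: a pixel is an edge pixel if any 4-neighbour differs."""
    f = feature_img
    edge = np.zeros(f.shape, dtype=bool)
    edge[:-1, :] |= f[:-1, :] != f[1:, :]   # differs from pixel below
    edge[1:, :]  |= f[1:, :]  != f[:-1, :]  # differs from pixel above
    edge[:, :-1] |= f[:, :-1] != f[:, 1:]   # differs from pixel to the right
    edge[:, 1:]  |= f[:, 1:]  != f[:, :-1]  # differs from pixel to the left
    return edge

def is_pockmarked(feature_img):
    """Per the claim: egg is flagged pockmarked if an edge area is detected."""
    return bool(edge_pixels(feature_img).any())
```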
Further, the method further comprises:
if the eggs to be detected are not soft-shell eggs, constructing a gray level co-occurrence matrix according to the gray level value of each pixel point in the gray level image of the egg area, and determining a contrast description operator and an energy description operator based on the gray level co-occurrence matrix;
comparing the contrast description operator and the energy description operator with their respective standard values to obtain the eggshell roughness of the egg area gray image;
performing edge detection on the gray level image of the egg area to obtain an edge profile of the egg area, and performing circular fitting and rectangular fitting on the edge profile of the egg area respectively to obtain a fitted circle and a fitted rectangle;
determining a first shape index and a second shape index of the gray level image of the egg area according to the area of the edge contour of the egg area, the area of the fitting circle and the length and width of the fitting rectangle;
comparing the first shape index and the second shape index of the egg area gray level image with the first shape index standard value and the second shape index standard value respectively to obtain the eggshell malformation degree of the egg area gray level image;
according to the eggshell roughness of the egg area gray level image, whether the egg to be detected is an eggshell rough egg or not is judged, and whether the egg to be detected is an eggshell malformed egg or not is judged according to the eggshell malformation degree of the egg area gray level image.
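The gray level co-occurrence matrix and its contrast and energy descriptors can be sketched in NumPy. The horizontal offset and 8-level quantisation are common defaults, not choices stated in the patent:

```python
import numpy as np

def glcm_contrast_energy(gray, levels=8):
    """Contrast and energy descriptors from a horizontal-offset GLCM.
    Gray values are quantised to `levels` bins; the matrix is normalised
    to joint probabilities of horizontally adjacent pixel pairs."""
    q = np.clip((gray.astype(int) * levels) // 256, 0, levels - 1)
    a, b = q[:, :-1].ravel(), q[:, 1:].ravel()   # left/right pixel pairs
    m = np.zeros((levels, levels))
    np.add.at(m, (a, b), 1.0)                    # accumulate co-occurrences
    m /= m.sum()
    i, j = np.indices(m.shape)
    contrast = ((i - j) ** 2 * m).sum()          # large for rough texture
    energy = (m ** 2).sum()                      # large for uniform texture
    return contrast, energy
```

A perfectly uniform shell image gives contrast 0 and energy 1; roughness pushes contrast up and energy down, which is what the comparison against the standard values exploits.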
Further, the calculation formulas for the first shape index and the second shape index of the egg area gray image are:

$$A=\frac{S_2}{S_1},\qquad B=\frac{a}{b}$$

wherein $A$ and $B$ are respectively the first shape index and the second shape index of the egg area gray image, $S_1$ is the area of the fitted circle, $S_2$ is the area of the edge contour of the egg area, and $a$ and $b$ are respectively the length and width of the fitted rectangle.
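A rough sketch of the two shape indices from a boolean egg mask. The fitting procedures here are clearly-flagged stand-ins: the "fitted circle" is centred on the centroid with radius equal to the farthest mask pixel, and the "fitted rectangle" is the axis-aligned bounding box, whereas the patent's circle and rectangle fitting methods are not reproduced:

```python
import numpy as np

def shape_indices(mask):
    """(first, second) shape index from a boolean egg-region mask."""
    ys, xs = np.nonzero(mask)
    area = float(len(xs))                       # contour (region) area S2
    cy, cx = ys.mean(), xs.mean()               # centroid
    r = np.hypot(ys - cy, xs - cx).max()        # farthest pixel => circle radius
    circle_area = np.pi * r * r                 # fitted-circle area S1
    a = xs.max() - xs.min() + 1                 # rectangle length
    b = ys.max() - ys.min() + 1                 # rectangle width
    first = area / circle_area                  # ~1 for a round region
    second = max(a, b) / min(a, b)              # ~1 unless elongated/malformed
    return first, second
```

For a near-circular mask both indices sit near 1; eggshell malformation moves them away from the standard values the patent compares against.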
The invention has the following beneficial effects: a visible light image of the egg to be detected is acquired and processed to obtain an egg area gray image; the actual gradient direction of each pixel point, its ideal gradient direction relative to a reference pixel point, and its corresponding gradient direction threshold are determined to obtain a texture feature image; and whether the egg to be detected is a soft-shell egg is determined from that texture feature image. Because the visible light image undergoes this data processing, the system can accurately determine whether the egg to be detected is a soft-shell egg without any human participation, which effectively reduces the cost of egg-laying abnormality detection.
Drawings
To illustrate the embodiments of the invention and the technical solutions of the prior art more clearly, the drawings used in their description are briefly introduced below. Obviously, the drawings described below show only some embodiments of the invention, and those skilled in the art can obtain other drawings from them without creative effort.
Fig. 1 is a flow chart of a method implemented by a poultry veterinary assistance system based on egg laying abnormality detection according to an embodiment of the present invention.
Detailed Description
To further explain the technical means the invention adopts to achieve its intended purpose, and their effects, the embodiments, structures, features and effects of the technical solutions according to the invention are described in detail below with reference to the accompanying drawings and preferred embodiments. In the following description, different references to "one embodiment" or "another embodiment" do not necessarily refer to the same embodiment. Furthermore, particular features, structures, or characteristics may be combined in any suitable manner in one or more embodiments.
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs.
Specific scenario: in poultry breeding, detect whether laid eggs are abnormal eggs such as soft-shell eggs, pockmarked eggs, thick-shell eggs or malformed eggs.
For the above scenario, this embodiment provides a poultry veterinary assistance system based on egg-laying abnormality detection. In addition to the main body device, the system comprises an image acquisition module, an egg-laying abnormality detection module and an egg-laying abnormality feedback reporting module that are connected in sequence. The image acquisition module acquires a visible light image of the egg to be detected and sends it to the egg-laying abnormality detection module; the detection module receives the image, performs data processing based on it, determines the abnormality detection result of the egg to be detected, and sends that result to the feedback reporting module; the feedback reporting module sends the abnormality detection result to the backend or a remote terminal.
The method for processing the data by the egg laying abnormality detection module based on the visible light image and determining the abnormality detection result of the egg to be detected is shown in fig. 1, and comprises the following steps:
step S1: and carrying out data processing on the visible light image according to the visible light image of the egg to be detected, thereby obtaining an egg area image.
At an automated poultry farm, the eggs produced by the poultry are photographed with the image acquisition module of the system, such as an industrial camera, to obtain visible light images. It should be emphasized that when photographing an egg, the industrial camera is placed at one side of the egg under a single natural light source and shoots the egg from the side, with the center of the eggshell's side surface directly facing the center of the camera.
After the visible light image is collected, the eggs in the visible light image are identified by utilizing a semantic segmentation neural network, so that an egg area image is obtained. Since the specific implementation process of identifying eggs in the visible light image by using the semantic segmentation neural network belongs to the prior art, the detailed description is omitted here.
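Since the patent treats network-based segmentation as prior art and gives no details, a minimal stand-in can illustrate how an egg-area mask might be obtained from a gray frame. The Otsu thresholding choice and the function names below are our assumptions, not the patent's method:

```python
import numpy as np

def otsu_threshold(gray):
    """Otsu's threshold for a uint8 gray image (NumPy-only)."""
    hist = np.bincount(gray.ravel(), minlength=256).astype(float)
    total = hist.sum()
    cum_w = np.cumsum(hist)                      # cumulative pixel counts
    cum_m = np.cumsum(hist * np.arange(256))     # cumulative intensity mass
    best_t, best_var = 0, -1.0
    for t in range(1, 256):
        w0, w1 = cum_w[t - 1], total - cum_w[t - 1]
        if w0 == 0 or w1 == 0:
            continue
        m0 = cum_m[t - 1] / w0
        m1 = (cum_m[255] - cum_m[t - 1]) / w1
        var = w0 * w1 * (m0 - m1) ** 2           # between-class variance
        if var > best_var:
            best_var, best_t = var, t
    return best_t

def egg_region_mask(gray):
    """Foreground mask: pixels brighter than the Otsu threshold."""
    return gray > otsu_threshold(gray)
```

This assumes the egg is brighter than the background, which holds for the side-lit capture setup the patent describes.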
Because the visible light image of the egg to be detected is captured under ambient light in this way, the obtained egg area image is brightest near the middle of the egg, i.e. it contains a central highlight.
From the illumination characteristics of a normal egg, the eggshell surface is known to change uniformly: the acquired image brightens gradually from the periphery toward the middle of the eggshell, so the gradient directions in the image point collectively toward the center of the eggshell. When the egg to be detected is a soft-shell egg, the top of the eggshell is sunken into a pit, and the gradient directions no longer all converge on the center. Therefore, whether the egg to be detected is a soft-shell egg can be identified by analyzing the gradients of the pixel points of the egg area gray image. The specific implementation process is as follows:
step S2: and performing data processing on the egg area image to obtain an egg area gray image, and determining the actual gradient direction and gradient value of each pixel point in the egg area gray image according to the gray value of each pixel point in the egg area gray image.
Gray-scale processing is performed on the egg area image to obtain the egg area gray image. According to the gray value of each pixel point, gradient calculation is performed on the egg area gray image with the Sobel operator, yielding the gradient direction and gradient value of each pixel point. To distinguish it from the ideal gradient direction introduced later, the gradient direction obtained by this calculation is called the actual gradient direction.
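Step S2's Sobel pass can be sketched with a NumPy-only convolution. The kernels are the standard Sobel kernels; the padding mode and the `arctan2` direction convention are our choices, as the patent does not specify them:

```python
import numpy as np

SOBEL_X = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], float)
SOBEL_Y = SOBEL_X.T

def conv2_same(img, k):
    """3x3 'same'-size correlation with edge padding (NumPy-only)."""
    p = np.pad(img.astype(float), 1, mode="edge")
    h, w = img.shape
    out = np.zeros((h, w))
    for dy in range(3):
        for dx in range(3):
            out += k[dy, dx] * p[dy:dy + h, dx:dx + w]
    return out

def sobel_gradients(gray):
    """Actual gradient direction (radians) and gradient value per pixel."""
    gx = conv2_same(gray, SOBEL_X)
    gy = conv2_same(gray, SOBEL_Y)
    return np.arctan2(gy, gx), np.hypot(gx, gy)
```

On an image that brightens left to right, the direction comes out as 0 rad (pointing along +x), matching the "gradient points toward the brighter center" reading used in step S3.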
Step S3: and determining a reference pixel point in the gray image of the egg area, and determining the ideal gradient direction of each pixel point in the gray image of the egg area relative to the reference pixel point according to the position of each pixel point in the gray image of the egg area and the position of the reference pixel point.
For a normal egg, the gradient directions of the image converge on the center of the eggshell, and based on this characteristic a reference pixel point at the eggshell center is determined. Because the center of the eggshell's side surface directly faces the center of the camera when the visible light image is acquired, the reference pixel point is theoretically the center pixel of the egg area gray image. However, considering that the camera center may deviate slightly from the center of the eggshell's side surface, this embodiment instead determines the pixel point with the maximum gray value in the egg area gray image and takes it as the reference pixel point.
After the reference pixel point in the egg area gray image is determined, the ideal gradient direction of each pixel point relative to the reference pixel point can be determined from the positions of the other pixel points relative to the reference pixel point. The corresponding calculation formula is:

$$\theta'_i = \arctan\left(\frac{y_0 - y_i}{x_0 - x_i}\right)$$

wherein $\theta'_i$ is the ideal gradient direction of the $i$-th pixel point in the egg area gray image relative to the reference pixel point, $y_i$ and $x_i$ are the ordinate and abscissa of the $i$-th pixel point, and $y_0$ and $x_0$ are the ordinate and abscissa of the reference pixel point in the egg area gray image.
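The reference-pixel choice and the ideal direction field can be sketched together. Using `arctan2` of the vector from each pixel toward the reference pixel is our reading of the formula's sign convention, since the original formula appears only as an image:

```python
import numpy as np

def reference_pixel(gray):
    """(x0, y0): position of the maximum-gray pixel, taken as the reference."""
    y0, x0 = np.unravel_index(np.argmax(gray), gray.shape)
    return int(x0), int(y0)

def ideal_gradient_direction(shape, x0, y0):
    """Per-pixel angle of the vector pointing at the reference pixel."""
    ys, xs = np.mgrid[0:shape[0], 0:shape[1]]
    return np.arctan2(y0 - ys, x0 - xs)
```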
Step S4: and determining a gradient direction threshold corresponding to each pixel point in the gray image of the egg area according to the gray value, the gradient value and the position of each pixel point in the gray image of the egg area and the position of the reference pixel point.
After the actual and ideal gradient directions of each pixel point in the egg area gray image are obtained through steps S2 and S3, the texture feature operator of each pixel point can be determined by computing the difference between the two directions, yielding a texture feature image from which a soft-shell egg can be identified. However, because the egg to be detected is an irregular ellipsoid, even a normal egg may show a large deviation between the actual and ideal gradient directions at some pixel points, which could cause false detections. Therefore, in this embodiment the gradient direction threshold corresponding to each pixel point is determined adaptively, by analyzing its gray value, its gradient value and its position relative to the reference pixel point. This strengthens robustness to illumination changes and finally yields an accurate texture feature image. The specific steps for adaptively determining the gradient direction threshold of each pixel point are:
a first substep: and determining the distance between each pixel point in the gray image of the egg area and the reference pixel point according to the position of each pixel point in the gray image of the egg area and the position of the reference pixel point, thereby obtaining a first gradient threshold value regulating factor of each pixel point in the gray image of the egg area.
Analysis of the egg area gray image shows that the closer a pixel point is to the image edge, the more likely its actual gradient direction is to agree with the ideal gradient direction, i.e. the less likely a deviation between the two becomes. Pixel points can therefore be weighted by their position in the image: the closer a pixel point is to the image edge, the smaller its corresponding gradient direction threshold. To measure position, the distance between each pixel point in the egg area gray image and the reference pixel point is calculated, and the first gradient threshold adjustment factor, which can be regarded as a weight, is determined from this distance:
wherein $\alpha_i$ is the first gradient threshold adjustment factor of the $i$-th pixel point in the egg area gray image, obtained from the distance $d_i=\sqrt{(x_i-x_0)^2+(y_i-y_0)^2}$ between that pixel point and the reference pixel point, $x_i$ and $y_i$ are the abscissa and ordinate of the $i$-th pixel point, and $x_0$ and $y_0$ are the abscissa and ordinate of the reference pixel point.
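A sketch of the first adjustment factor. The patent only states that pixels nearer the image edge get a smaller threshold; the linear decay `1 - d / d_max` below is our assumed form, not the patent's exact formula:

```python
import numpy as np

def first_threshold_factor(shape, x0, y0):
    """Distance-to-reference weight: 1 at the reference pixel, decaying
    linearly to 0 at the farthest pixel (assumed form)."""
    ys, xs = np.mgrid[0:shape[0], 0:shape[1]]
    d = np.hypot(xs - x0, ys - y0)
    return 1.0 - d / max(d.max(), 1e-12)
```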
The second sub-step: and determining a second gradient threshold adjustment factor of each pixel point in the gray image of the egg area according to the gray value and the gradient value of each pixel point in the gray image of the egg area.
In a highlight area, brightness prevents other information from being observed. Moreover, the defects of soft-shell eggs mainly occur in the upper part of the egg body, i.e. the upper part of the ellipsoid, while the highlight captured in the image mostly lies in the middle of the ellipsoid's side surface. When comparing the actual gradient direction with the ideal one, a larger threshold can therefore be given to the highlight area, effectively ignoring its gradient direction differences. In addition, since the judgment is made from gradient directions, positions where the gradient value of the original image changes greatly deserve more attention; that is, the larger a pixel point's gradient value, the smaller its subsequent threshold should be.
Based on the analysis, determining a second gradient threshold value adjusting factor of each pixel point in the gray image of the egg area according to the gray value and the gradient value of each pixel point in the gray image of the egg area, wherein the corresponding calculation formula is as follows:
wherein $\beta_i$ is the second gradient threshold adjustment factor of the $i$-th pixel point in the egg area gray image, $g_i$ is the gray value of the $i$-th pixel point, and $G_i$ is the gradient value of the $i$-th pixel point.
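A sketch of the second adjustment factor. The patent's exact formula is an image we could not recover; the ratio form below only reproduces the two monotonicities the description states (larger for bright highlight pixels, smaller for pixels with large gradient values):

```python
import numpy as np

def second_threshold_factor(gray, grad_mag):
    """Assumed form: grows with brightness, shrinks with gradient magnitude."""
    g = gray.astype(float) / 255.0       # normalised gray value
    return g / (1.0 + grad_mag)          # penalise strong gradients
```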
The third substep: calculating a gradient direction threshold corresponding to each pixel point in the gray level image of the egg area according to the first gradient threshold adjustment factor and the second gradient threshold adjustment factor of each pixel point in the gray level image of the egg area, wherein the corresponding calculation formula is as follows:
wherein the quantities in the formula are, in order: the gradient direction threshold corresponding to the i-th pixel point in the gray image of the egg area; the fixed value of the gradient direction, which essentially means the error that can be tolerated between the actual gradient direction and the ideal gradient direction without considering the influence of pixel position, gray scale and other factors, and which can be determined from experience, or by calculating the absolute value of the difference between the actual and ideal gradient directions of each pixel point in the gray image of the egg area and then applying the Otsu threshold method to those absolute differences; the first gradient threshold adjustment factor of the i-th pixel point; and the second gradient threshold adjustment factor of the i-th pixel point.
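The Otsu step and the combination of the two factors can be sketched as follows. The way the fixed base value and the two factors are combined is not reproduced in the text; a multiplicative combination is assumed here purely for illustration.

```python
import numpy as np

def otsu_threshold(values, bins=64):
    # Otsu's method on the absolute actual-vs-ideal direction differences:
    # pick the histogram split maximizing between-class variance.
    hist, edges = np.histogram(values, bins=bins)
    p = hist / hist.sum()
    centers = (edges[:-1] + edges[1:]) / 2.0
    best_t, best_var = edges[1], -1.0
    for k in range(1, bins):
        w0, w1 = p[:k].sum(), p[k:].sum()
        if w0 == 0.0 or w1 == 0.0:
            continue
        mu0 = (p[:k] * centers[:k]).sum() / w0
        mu1 = (p[k:] * centers[k:]).sum() / w1
        var = w0 * w1 * (mu0 - mu1) ** 2
        if var > best_var:
            best_var, best_t = var, edges[k]
    return best_t

def direction_thresholds(base, f1, f2):
    # One plausible (assumed) combination: scale the fixed base value by
    # both per-pixel adjustment factors.
    return base * f1 * f2

diffs = np.concatenate([np.full(50, 0.1), np.full(50, 0.9)])
base = otsu_threshold(diffs)  # falls between the two clusters
```

On clearly bimodal direction differences, the returned base value separates the tolerated cluster from the outlier cluster.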
Step S5: and determining a texture feature operator of each pixel point in the gray image of the egg area according to the actual gradient direction of each pixel point in the gray image of the egg area, the ideal gradient direction of each pixel point relative to the reference pixel point and the gradient direction threshold value corresponding to each pixel point, thereby obtaining the texture feature image.
Determining a texture characteristic operator of each pixel point by combining the actual gradient direction and the ideal gradient direction of each pixel point in the gray image of the egg area and a gradient direction threshold value determined in a self-adaptive manner, wherein the specific implementation steps comprise:
a first substep: determining the ternary codes of all the pixel points in the gray image of the egg area according to the actual gradient direction of all the pixel points in the gray image of the egg area, the ideal gradient direction of all the pixel points relative to the reference pixel point and the gradient direction threshold value corresponding to all the pixel points, wherein the corresponding calculation formula is as follows:
wherein the quantities in the formula are, in order: the ternary code of the i-th pixel point in the gray image of the egg area; the actual gradient direction of the i-th pixel point; the ideal gradient direction of the i-th pixel point relative to the reference pixel point; and the gradient direction threshold corresponding to the i-th pixel point.
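The ternary coding step above can be sketched as follows. The sign convention (which deviation direction maps to +1 versus -1) is not fully recoverable from the text and is assumed here.

```python
import numpy as np

def ternary_code(actual, ideal, thresh):
    # +1 / -1 when the actual gradient direction deviates from the ideal
    # one by more than the per-pixel threshold (sign convention assumed),
    # 0 when the deviation is within the tolerated threshold.
    diff = actual - ideal
    code = np.zeros(diff.shape, dtype=int)
    code[diff > thresh] = 1
    code[diff < -thresh] = -1
    return code

codes = ternary_code(np.array([0.9, 0.5, 0.1]),
                     np.array([0.5, 0.5, 0.5]),
                     np.array([0.2, 0.2, 0.2]))
```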
The second sub-step: and determining texture feature operators of all pixel points in the gray level image of the egg area according to the ternary codes of the eight neighborhood pixel points of all the pixel points in the gray level image of the egg area.
For each pixel point in the gray image of the egg area, the eight neighborhood pixel points are numbered: the neighborhood pixel point directly to the left of the pixel point is the first eight-neighborhood pixel point, and the second through eighth eight-neighborhood pixel points then follow in clockwise order. According to the ternary codes of the eight neighborhood pixel points and their numbers, encoding is carried out separately for positive values and for negative values, giving a first texture feature operator and a second texture feature operator for the pixel point. The formulas corresponding to the first and second texture feature operators are as follows:
wherein the quantities in the formula are, in order: the first texture feature operator and the second texture feature operator of the i-th pixel point in the gray image of the egg area; the ternary code of the m-th eight-neighborhood pixel point of the i-th pixel point, with m ranging over the eight neighborhood numbers; and the ternary code of the n-th eight-neighborhood pixel point of the i-th pixel point, with n likewise ranging over the eight neighborhood numbers.
In the process of encoding each pixel point in the gray image of the egg area according to positive and negative values to obtain the first and second texture feature operators, the ternary code of each of the eight neighborhood pixel points takes one of three values: -1, 0 or +1. When the first texture feature operator is determined, only the neighborhood pixel points whose ternary code is +1 among the 8 neighborhood pixel points are used; when the second texture feature operator is determined, only those whose ternary code is -1 are used. As shown in Table 1 below, only the neighborhood pixel points with ternary code +1 are considered in the positive encoding process, giving 2^2 + 2^3 = 12; in the negative encoding process, only the neighborhood pixel points with ternary code -1 are considered, giving 2^6 + 2^7 = 192.
TABLE 1
1 | 1 | 0
---|---|---
0 | (center pixel) | 0
0 | -1 | -1
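Under the numbering and weighting assumed here (weight 2**m for the m-th neighbour, chosen because it reproduces the patent's own worked example), the Table 1 example can be reproduced:

```python
def texture_operators(neigh_codes):
    # neigh_codes: ternary codes of the eight neighbours of one pixel in
    # their numbering order m = 1..8 (first neighbour directly to the
    # left, then around in order). Weight 2**m is an assumption that
    # matches the worked example values 12 and 192.
    pos = sum(2 ** m for m, c in enumerate(neigh_codes, start=1) if c == 1)
    neg = sum(2 ** m for m, c in enumerate(neigh_codes, start=1) if c == -1)
    return pos, neg

# Table 1: neighbours #2 and #3 coded +1, neighbours #6 and #7 coded -1
pos_op, neg_op = texture_operators([0, 1, 1, 0, 0, -1, -1, 0])  # -> (12, 192)
```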
After a first texture feature operator and a second texture feature operator of each pixel point in the gray level image of the egg area are obtained, two texture feature operator graphs can be obtained, and the two texture feature operator graphs respectively reflect the gradient direction features in the gray level image of the egg area from different angles. And overlapping the pixel values of the same pixel point in the two texture feature operator images, namely overlapping a first texture feature operator and a second texture feature operator of the same pixel point, so as to obtain the texture feature operator of each pixel point in the egg region gray level image, and further obtain the texture feature image of the egg region gray level image.
According to the way the texture feature operator of each pixel point is determined, for each pixel point in the texture feature image: when the difference between the actual and ideal gradient directions of any one of its eight neighborhood pixel points is larger than the gradient direction threshold corresponding to that neighborhood pixel point, the gradient direction difference of that neighborhood pixel point is considered large; and the more such large gradient direction differences there are among the eight neighborhood pixel points, the larger the texture feature operator corresponding to the pixel point and the brighter that location of the texture feature image.
Step S6: and determining whether the egg to be detected is a soft-shell egg or not according to the texture feature operator of each pixel point in the texture feature image.
After the texture feature image corresponding to the egg area gray level image is obtained, it is analyzed: the texture in the dark parts of the image, i.e. the parts with smaller texture feature operators, accords with the predicted direction, while the gradient direction in the bright parts, i.e. the parts with larger texture feature operators, does not accord with the texture change of a normal egg.
Therefore, the average texture feature operator can be obtained by calculating the mean of the texture feature operators of all pixel points in the texture feature image. This average texture feature operator can be regarded as the average gray value of the texture feature image, which represents the uniformity of the eggshell gray: the lower the average gray value, the more uniformly the eggshell gray changes. A texture feature operator threshold is set (a specific value is chosen in this embodiment), and the average gray value is compared with it; if the average gray value is greater than the texture feature operator threshold, the egg to be detected is judged to be a soft-shell egg.
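The soft-shell decision reduces to a single mean-and-compare step, sketched below. The concrete threshold value is elided in the text, so the value used here is illustrative only.

```python
import numpy as np

def is_soft_shell(texture_image, operator_threshold):
    # The mean texture feature operator serves as the average gray value
    # of the texture feature image; a high mean indicates many
    # gradient-direction outliers, i.e. a non-uniform shell.
    return float(texture_image.mean()) > operator_threshold

smooth_shell = np.zeros((4, 4))       # uniform gray change, dark map
defect_shell = np.full((4, 4), 60.0)  # many bright operator responses
```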
Step S7: and if the egg to be detected is not the soft-shell egg, performing data processing on the texture characteristic image, and judging whether the egg to be detected is the spotted egg.
And if the eggs to be detected are not soft-shell eggs, performing edge detection on the texture feature images, and if an edge area is detected, determining that the eggs to be detected are pockmarked eggs and the edge area is the pockmarked area.
As a result of analyzing the texture feature image obtained in step S5, when a pock occurs on the eggshell, the change in the gray gradient direction near the pock is not uniform, which is reflected in the texture feature image as areas of high brightness. Therefore, edge detection is performed on the texture feature image, and an obtained edge contour is a pockmark area, indicating that the egg to be detected is a pockmarked egg.
Step S8: if the egg to be detected is not a soft-shell egg, carrying out data processing on the gray level image of the egg area, and judging whether the egg to be detected is a rough-eggshell egg or a deformed-eggshell egg.
A first substep: and constructing a gray level co-occurrence matrix according to the gray level value of each pixel point in the gray level image of the egg area, and determining a contrast description operator and an energy description operator based on the gray level co-occurrence matrix.
And constructing a gray level co-occurrence matrix of the gray level image of the egg area, and further determining a contrast descriptor Con and an energy descriptor Asm of the gray level co-occurrence matrix. The contrast description operator Con reflects the definition of the image and the depth of the grooves of the texture, and the energy description operator Asm reflects the uniformity of the gray level distribution of the image and the thickness of the texture. Since the specific process of obtaining the contrast descriptor Con and the energy descriptor Asm of the gray level co-occurrence matrix belongs to the prior art, the detailed description is omitted here.
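Since the patent treats the GLCM construction as prior art, a minimal self-contained sketch (horizontally adjacent pixel pairs only, coarse gray quantization) is given here to make the contrast and energy descriptors concrete:

```python
import numpy as np

def glcm_contrast_asm(img, levels=8):
    # Co-occurrence counts for horizontally adjacent pixel pairs only;
    # a minimal stand-in for the full prior-art GLCM construction.
    q = (img.astype(float) / 256.0 * levels).astype(int)  # quantized gray
    glcm = np.zeros((levels, levels))
    for a, b in zip(q[:, :-1].ravel(), q[:, 1:].ravel()):
        glcm[a, b] += 1
    p = glcm / glcm.sum()
    i, j = np.mgrid[0:levels, 0:levels]
    con = float(((i - j) ** 2 * p).sum())  # clarity / groove depth
    asm = float((p ** 2).sum())            # uniformity of gray distribution
    return con, asm

uniform = np.zeros((8, 8), dtype=np.uint8)                        # smooth shell
checker = np.tile(np.array([[0, 255], [255, 0]], dtype=np.uint8), (4, 4))
con_u, asm_u = glcm_contrast_asm(uniform)  # low contrast, high energy
con_c, asm_c = glcm_contrast_asm(checker)  # high contrast, low energy
```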
The second sub-step: comparing the contrast descriptor and the energy descriptor with the contrast descriptor standard value and the energy descriptor standard value respectively to obtain the eggshell roughness of the grey image of the egg area, wherein the corresponding calculation formula is as follows:
wherein the quantities in the formula are, in order: the eggshell roughness of the gray image of the egg area; the contrast descriptor and the energy descriptor; and the contrast descriptor standard value and the energy descriptor standard value, which are determined from the egg area gray image of a normal standard egg of the same type as the egg to be detected.
The third substep: performing edge detection on the gray level image of the egg area to obtain an edge profile of the egg area, and performing circle fitting and rectangle fitting on the edge profile of the egg area respectively to obtain a fitted circle and a fitted rectangle.
Edge detection is performed on the gray image of the egg area with the Canny operator to obtain the image edge, which is then fitted to obtain the edge contour of the egg area. Circle fitting and rectangle fitting are performed on this edge contour to obtain a fitted circle and a fitted rectangle; since the concrete implementation of circle fitting and rectangle fitting belongs to the prior art, it is not described further here.
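As one common prior-art choice for the circle-fitting step, an algebraic least-squares (Kasa) fit can be used; the patent does not specify which method it relies on, so this is a representative stand-in:

```python
import numpy as np

def fit_circle(xs, ys):
    # Kasa circle fit: solve x^2 + y^2 = c*x + d*y + e in least squares,
    # then cx = c/2, cy = d/2, r = sqrt(e + cx^2 + cy^2).
    A = np.column_stack([xs, ys, np.ones_like(xs)])
    b = xs ** 2 + ys ** 2
    c, d, e = np.linalg.lstsq(A, b, rcond=None)[0]
    cx, cy = c / 2.0, d / 2.0
    return cx, cy, np.sqrt(e + cx ** 2 + cy ** 2)

t = np.linspace(0.0, 2.0 * np.pi, 40, endpoint=False)
cx, cy, r = fit_circle(3.0 + 2.0 * np.cos(t), -1.0 + 2.0 * np.sin(t))
```

On exact circular edge points the fit recovers center (3, -1) and radius 2.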
The fourth substep: determining a first shape index and a second shape index of the gray level image of the egg area according to the area of the edge contour of the egg area, the area of the fitted circle, and the length and width of the fitted rectangle, wherein the corresponding calculation formula is as follows:
wherein the quantities in the formula are, in order: the first shape index and the second shape index B of the gray image of the egg area; the area of the fitted circle; the area of the edge contour of the egg area; and the length and width of the fitted rectangle.
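One plausible reading of the two indices, given the quantities they are built from, is sketched below; the patent's exact ratios are not reproduced in the text, so both expressions are assumptions kept deliberately simple.

```python
def shape_indices(contour_area, circle_area, rect_len, rect_wid):
    # Assumed reading: the first index compares the egg contour's area
    # with its fitted circle (1.0 for a perfect circle); the second is
    # the fitted rectangle's aspect ratio.
    return contour_area / circle_area, rect_len / rect_wid

e_idx, b_idx = shape_indices(contour_area=80.0, circle_area=100.0,
                             rect_len=12.0, rect_wid=8.0)
```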
The fifth substep: comparing the first shape index and the second shape index of the egg area gray level image with the first shape index standard value and the second shape index standard value respectively to obtain the eggshell deformity degree of the egg area gray level image, wherein the corresponding calculation formula is as follows:
wherein the quantities in the formula are, in order: K, the eggshell deformity degree of the gray image of the egg area; the first shape index and the second shape index; and the first shape index standard value and the second shape index standard value, which are determined from the egg area gray image of a normal standard egg of the same type as the egg to be detected.
From the calculation formula of the eggshell deformity degree K of the gray image of the egg area, K is obtained by comparing the first and second shape indexes of the egg to be detected against the first and second shape index standard values determined from a normal standard egg: the smaller the differences, the smaller the eggshell deformity degree K, and the better the shape of the egg to be detected.
The sixth substep: according to the eggshell roughness of the egg area gray level image, whether the egg to be detected is an eggshell rough egg or not is judged, and whether the egg to be detected is an eggshell malformed egg or not is judged according to the eggshell malformation degree of the egg area gray level image.
An eggshell roughness threshold and an eggshell deformity degree threshold are preset; in this embodiment the eggshell deformity degree threshold is set to 0.3. The eggshell roughness C of the gray image of the egg area is compared with the eggshell roughness threshold; if C exceeds it, the abnormal situation of a rough eggshell is considered present and the egg to be detected is judged to be a rough-eggshell egg. The eggshell deformity degree K of the gray image of the egg area is compared with the eggshell deformity degree threshold; if K exceeds it, the abnormal situation of eggshell deformity is considered present and the egg to be detected is judged to be a deformed-eggshell egg.
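The final thresholding step amounts to two independent comparisons, sketched below. Only the value 0.3 survives in the text and its exact assignment is unclear, so both cut-offs here are illustrative defaults.

```python
def classify_shell(roughness, deformity, rough_thresh=0.3, deform_thresh=0.3):
    # Flag each shell abnormality independently; threshold values are
    # illustrative (only 0.3 appears in the text, assignment assumed).
    flags = []
    if roughness > rough_thresh:
        flags.append("rough eggshell")
    if deformity > deform_thresh:
        flags.append("deformed eggshell")
    return flags

flags = classify_shell(roughness=0.5, deformity=0.1)  # -> ["rough eggshell"]
```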
When the egg to be detected is found through the above steps to have one or more egg laying abnormality conditions, the egg laying abnormality detection module sends the detection result to the egg laying abnormality feedback reporting module, which reports it to the background or remote end in time. There, referable abnormality causes and solutions are given for the specific egg laying abnormality. For example, soft-shell and thin-shell eggs mostly result from a deficiency of vitamin D3 or from insufficient or improperly proportioned calcium and phosphorus in the feed, while a pockmarked eggshell indicates that the poultry oviduct or cloaca is inflamed. By adjusting the laying environment of the poultry in time according to these causes, the egg laying abnormalities can be reduced or eliminated, thereby reducing the economic loss they cause.
According to the invention, by obtaining a visible light image of the egg to be detected and carrying out data identification and processing on it, whether the egg to be detected exhibits abnormal conditions such as being a soft-shell egg, pockmarked egg, rough-eggshell egg or deformed-eggshell egg can be determined automatically and accurately, with no human participation required, effectively reducing the cost of egg laying abnormality detection.
It should be noted that: the above-mentioned embodiments are only used for illustrating the technical solutions of the present application, and not for limiting the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; such modifications and substitutions do not substantially depart from the spirit and scope of the embodiments of the present application and are intended to be included within the scope of the present application.
Claims (5)
1. The poultry veterinarian auxiliary system based on egg laying abnormality detection is characterized by comprising an image acquisition module, an egg laying abnormality detection module and an egg laying abnormality feedback reporting module, wherein the image acquisition module is used for acquiring visible light images of eggs to be detected and sending the visible light images to the egg laying abnormality detection module, the egg laying abnormality detection module is used for receiving the visible light images, performing data processing based on the visible light images, determining abnormality detection results of the eggs to be detected and sending the abnormality detection results to the egg laying abnormality feedback reporting module, and the egg laying abnormality feedback reporting module is used for sending the abnormality detection results to a background or a far end;
the method for determining the abnormal detection result of the egg to be detected comprises the following steps:
according to the visible light image of the egg to be detected, carrying out data processing on the visible light image so as to obtain an egg area image;
performing data processing on the egg area image to obtain an egg area gray image, and determining the actual gradient direction and gradient value of each pixel point in the egg area gray image according to the gray value of each pixel point in the egg area gray image;
determining reference pixel points in the gray image of the egg area, and determining the ideal gradient direction of each pixel point in the gray image of the egg area relative to the reference pixel points according to the position of each pixel point in the gray image of the egg area and the position of the reference pixel points;
determining a gradient direction threshold corresponding to each pixel point in the gray image of the egg area according to the gray value, the gradient value and the position of each pixel point in the gray image of the egg area and the position of the reference pixel point;
determining a texture feature operator of each pixel point in the gray image of the egg area according to the actual gradient direction of each pixel point in the gray image of the egg area, the ideal gradient direction of each pixel point relative to the reference pixel point and the gradient direction threshold value corresponding to each pixel point, thereby obtaining a texture feature image;
determining whether the egg to be detected is a soft-shell egg or not according to the texture feature operator of each pixel point in the texture feature image;
the determining of the gradient direction threshold corresponding to each pixel point in the gray level image of the egg area comprises the following steps:
determining the distance between each pixel point in the gray image of the egg area and the reference pixel point according to the position of each pixel point in the gray image of the egg area and the position of the reference pixel point, thereby obtaining a first gradient threshold value adjusting factor of each pixel point in the gray image of the egg area;
determining a second gradient threshold adjustment factor of each pixel point in the gray image of the egg area according to the gray value and the gradient value of each pixel point in the gray image of the egg area;
calculating a gradient direction threshold corresponding to each pixel point in the gray level image of the egg area according to the first gradient threshold adjustment factor and the second gradient threshold adjustment factor of each pixel point in the gray level image of the egg area;
determining a calculation formula corresponding to the second gradient threshold adjustment factor of each pixel point in the gray level image of the egg area as follows:
wherein the quantities in the formula are, in order: the second gradient threshold adjustment factor of the i-th pixel point in the gray image of the egg area; the gray value of the i-th pixel point; and the gradient value of the i-th pixel point;
the calculation formula for calculating the corresponding gradient direction threshold value of each pixel point in the gray level image of the egg area is as follows:
wherein the quantities in the formula are, in order: the gradient direction threshold corresponding to the i-th pixel point in the gray image of the egg area; the fixed value of the gradient direction; the first gradient threshold adjustment factor of the i-th pixel point; and the second gradient threshold adjustment factor of the i-th pixel point;
the determining of the texture feature operator of each pixel point in the gray level image of the egg area comprises the following steps:
determining the ternary codes of all pixel points in the gray level image of the egg area according to the actual gradient direction of all the pixel points in the gray level image of the egg area, the ideal gradient direction of all the pixel points relative to the reference pixel points and the gradient direction threshold value corresponding to all the pixel points;
determining texture feature operators of all pixel points in the gray level image of the egg area according to the ternary codes of the eight neighborhood pixel points of all the pixel points in the gray level image of the egg area;
determining a calculation formula corresponding to the three-value codes of all pixel points in the gray level image of the egg area as follows:
wherein the quantities in the formula are, in order: the ternary code of the i-th pixel point in the gray image of the egg area; the actual gradient direction of the i-th pixel point; the ideal gradient direction of the i-th pixel point relative to the reference pixel point; and the gradient direction threshold corresponding to the i-th pixel point.
2. The poultry veterinary auxiliary system based on egg-laying abnormality detection according to claim 1, wherein the calculation formula for determining the ideal gradient direction of each pixel point in the gray image of the egg area relative to the reference pixel point is:
wherein the quantities in the formula are, in order: the ideal gradient direction of the i-th pixel point in the gray image of the egg area relative to the reference pixel point; the ordinate of the i-th pixel point; the abscissa of the i-th pixel point; the ordinate of the reference pixel point; and the abscissa of the reference pixel point.
3. The poultry veterinary assistance system based on egg abnormality detection according to claim 1, characterized in that the method further comprises:
and if the eggs to be detected are not soft-shell eggs, performing edge detection on the texture feature images, and if an edge area is detected, determining that the eggs to be detected are pockmarked eggs and the edge area is the pockmarked area.
4. The poultry veterinary assistance system based on egg abnormality detection according to claim 1, characterized in that the method further comprises:
if the eggs to be detected are not soft-shell eggs, constructing a gray level co-occurrence matrix according to the gray level value of each pixel point in the gray level image of the egg area, and determining a contrast description operator and an energy description operator based on the gray level co-occurrence matrix;
comparing the contrast descriptor and the energy descriptor with the contrast descriptor standard value and the energy descriptor standard value respectively to obtain the eggshell roughness of the grey image of the egg area;
performing edge detection on the gray level image of the egg area to obtain an edge profile of the egg area, and performing circular fitting and rectangular fitting on the edge profile of the egg area respectively to obtain a fitted circle and a fitted rectangle;
determining a first shape index and a second shape index of the gray level image of the egg area according to the area of the edge contour of the egg area, the area of the fitting circle and the length and width of the fitting rectangle;
comparing the first shape index and the second shape index of the egg area gray level image with the first shape index standard value and the second shape index standard value respectively to obtain the eggshell malformation degree of the egg area gray level image;
according to the eggshell roughness of the egg area gray level image, whether the egg to be detected is an eggshell rough egg or not is judged, and whether the egg to be detected is an eggshell malformed egg or not is judged according to the eggshell malformation degree of the egg area gray level image.
5. The poultry veterinary assistance system based on egg abnormality detection according to claim 4, wherein the calculation formula for determining the correspondence of the first shape index and the second shape index of the gray scale image of the egg area is:
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202210683005.3A CN114782421B (en) | 2022-06-17 | 2022-06-17 | Poultry veterinarian auxiliary system based on egg laying abnormality detection |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202210683005.3A CN114782421B (en) | 2022-06-17 | 2022-06-17 | Poultry veterinarian auxiliary system based on egg laying abnormality detection |
Publications (2)
Publication Number | Publication Date |
---|---|
CN114782421A CN114782421A (en) | 2022-07-22 |
CN114782421B true CN114782421B (en) | 2022-08-26 |
Family
ID=82420924
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202210683005.3A Active CN114782421B (en) | 2022-06-17 | 2022-06-17 | Poultry veterinarian auxiliary system based on egg laying abnormality detection |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN114782421B (en) |
Families Citing this family (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN115760884B (en) * | 2023-01-06 | 2023-04-14 | 山东恩信特种车辆制造有限公司 | Semitrailer surface welding slag optimization segmentation method based on image processing |
JP7349218B1 (en) * | 2023-05-29 | 2023-09-22 | 株式会社日本選別化工 | Egg surface inspection device |
CN117390089B (en) * | 2023-11-14 | 2024-03-19 | 河北玖兴农牧发展有限公司 | Pre-hatching egg fertilization information statistical method and system |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2000235005A (en) * | 1999-02-15 | 2000-08-29 | Nidec Tosok Corp | Egg inspection device |
CN108776143A (en) * | 2018-05-28 | 2018-11-09 | 湖北工业大学 | A kind of online vision inspection apparatus and method of the small stain of egg eggshell surface |
CN110991220A (en) * | 2019-10-15 | 2020-04-10 | 北京海益同展信息科技有限公司 | Egg detection method, egg image processing method, egg detection device, egg image processing device, electronic equipment and storage medium |
CN114529802A (en) * | 2022-01-26 | 2022-05-24 | 扬州大学 | Goose egg identification and positioning method and system based on machine vision |
- 2022-06-17 CN CN202210683005.3A patent/CN114782421B/en active Active
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2000235005A (en) * | 1999-02-15 | 2000-08-29 | Nidec Tosok Corp | Egg inspection device |
CN108776143A (en) * | 2018-05-28 | 2018-11-09 | 湖北工业大学 | A kind of online vision inspection apparatus and method of the small stain of egg eggshell surface |
CN110991220A (en) * | 2019-10-15 | 2020-04-10 | 北京海益同展信息科技有限公司 | Egg detection method, egg image processing method, egg detection device, egg image processing device, electronic equipment and storage medium |
CN114529802A (en) * | 2022-01-26 | 2022-05-24 | 扬州大学 | Goose egg identification and positioning method and system based on machine vision |
Non-Patent Citations (3)
Title |
---|
Multi-feature eggshell crack recognition based on the Adaboosting_SVM algorithm; Xiong Lirong et al.; Journal of Huazhong Agricultural University; 2015-03-15 (Issue 02); full text *
Non-destructive detection of egg freshness based on visible/near-infrared hyperspectral imaging; Yang Xiaoyu et al.; Food and Machinery; 2017-11-28 (Issue 11); full text *
Research on egg crack detection technology based on short-time Fourier transform; Ping Jianfeng et al.; Chinese Journal of Sensors and Actuators; 2009-07-20 (Issue 07); full text *
Also Published As
Publication number | Publication date |
---|---|
CN114782421A (en) | 2022-07-22 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN114782421B (en) | Poultry veterinarian auxiliary system based on egg laying abnormality detection | |
CN116109644A (en) | Surface defect detection method for copper-aluminum transfer bar | |
CN116740070A (en) | Plastic pipeline appearance defect detection method based on machine vision | |
CN116310845B (en) | Intelligent monitoring system for sewage treatment | |
CN116337879B (en) | Rapid detection method for abrasion defect of cable insulation skin | |
CN116229335A (en) | Livestock and poultry farm environment recognition method based on image data | |
CN116071363B (en) | Automatic change shaped steel intelligent production monitoring system | |
CN116843692B (en) | Regenerated active carbon state detection method based on artificial intelligence | |
CN116485801A (en) | Rubber tube quality online detection method and system based on computer vision | |
CN115170567A (en) | Method for detecting defects of waterproof steel plate for ship | |
CN117934460B (en) | Intelligent detection method for surface defects of insulating plate based on visual detection | |
CN117893457B (en) | PCB intelligent detection method, device and computer equipment | |
CN117830300B (en) | Visual-based gas pipeline appearance quality detection method | |
CN117218115B (en) | Auto part paint surface abnormality detection method | |
CN117705815B (en) | Printing defect detection method based on machine vision | |
CN116577345B (en) | Method and system for detecting number of tabs of lithium battery | |
CN117078678B (en) | Waste silicon wafer shape detection method based on image recognition | |
CN115861318B (en) | Cotton processing production quality detection method | |
CN117388263A (en) | Hardware terminal quality detection method for charging gun | |
CN117274405A (en) | LED lamp working color detection method based on machine vision | |
CN111126371B (en) | Coarse pointer dial reading method based on image processing | |
CN116205923A (en) | Nondestructive testing method for internal defects of automobile hub based on X-RAY | |
CN116993717A (en) | Visual detection method for production quality of heating wire of electronic cigarette | |
CN115205317B (en) | Bridge monitoring photoelectric target image light spot center point extraction method | |
CN116958134B (en) | Plastic film extrusion quality evaluation method based on image processing |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||