CN117808806B - Feed production quality refinement detection method based on image feature analysis - Google Patents

Feed production quality refinement detection method based on image feature analysis Download PDF

Info

Publication number
CN117808806B
CN117808806B CN202410224600.XA CN202410224600A CN117808806B CN 117808806 B CN117808806 B CN 117808806B CN 202410224600 A CN202410224600 A CN 202410224600A CN 117808806 B CN117808806 B CN 117808806B
Authority
CN
China
Prior art keywords
feed
edge
rough
point
region
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202410224600.XA
Other languages
Chinese (zh)
Other versions
CN117808806A (en
Inventor
姜建宏
李海涛
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Demuxirui Biotechnology Tianjin Co ltd
Original Assignee
Demuxirui Biotechnology Tianjin Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Demuxirui Biotechnology Tianjin Co ltd filed Critical Demuxirui Biotechnology Tianjin Co ltd
Priority to CN202410224600.XA priority Critical patent/CN117808806B/en
Publication of CN117808806A publication Critical patent/CN117808806A/en
Application granted granted Critical
Publication of CN117808806B publication Critical patent/CN117808806B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Landscapes

  • Image Analysis (AREA)

Abstract

The invention relates to the technical field of edge detection, in particular to a feed production quality refinement detection method based on image feature analysis. According to the method, initial homoplanar probability of mutation points is obtained according to gray distribution differences in two sub-areas under the mutation points in a rough feed area in a feed gray image, initial homoplanar probability of all the mutation points in the same rough feed area is synthesized, final homoplanar probability of the rough feed area is obtained, and edge responsivity of pixel points is adjusted by utilizing the obtained main surface probability in combination with position distribution analysis between the shape of the rough feed area and the edge pixel points on the edge, so that the edge of a feed main body is obtained, and the production quality of the feed is finely detected based on the length of the main surface probability. The invention reduces the edge responsivity of the side edge of the feed by utilizing the possibility of the main body surface, reduces the possibility of false identification of the edge of the feed main body, and improves the accuracy of refined detection of the feed production quality.

Description

Feed production quality refinement detection method based on image feature analysis
Technical Field
The invention relates to the technical field of edge detection, in particular to a feed production quality refinement detection method based on image feature analysis.
Background
The quality of the feed is an important component in livestock and poultry production, and has direct and profound effects on the production performance, health condition and final product quality of the livestock and poultry. Therefore, it is very important to perform fine detection on the quality of feed production.
The current standard for the feed refinement detection is to analyze the size degree of feed particles through research, and particularly judge the size degree through the length of the edges of the feed main body. In the prior art, the edges of the main body surface of the feed in the feed image to be detected are obtained through a Hessen matrix method, the feed is randomly stacked and placed when the feed is subjected to fine detection, the edges of the side surface of the feed are obvious under the influence of illumination, and in the process of obtaining the edges of the main body surface of the feed through the Hessen matrix method, the edges of the side surface of the feed are mistakenly identified as the edges of the main body surface of the feed, so that the accuracy of fine detection of the production quality of the feed is reduced.
Disclosure of Invention
In order to solve the technical problem that the accuracy of the fine detection of the production quality of the feed is reduced due to the fact that the edge of the side face of the feed is mistakenly identified as the edge of the main body of the feed, the invention aims to provide the fine detection method of the production quality of the feed based on image feature analysis, and the adopted technical scheme is as follows:
the invention provides a feed production quality refined detection method based on image feature analysis, which comprises the following steps:
Acquiring a feed gray image of a feed to be tested;
Dividing the feed gray level image into different feed rough areas; dividing each feed rough region into a first sub-region and a second sub-region under each mutation point based on the position of each mutation point in each feed rough region; acquiring the initial homoplanar probability of each mutation point in each feed rough region according to the difference between the gray level distribution of the pixel points in the first sub-region and the second sub-region of each feed rough region under the same mutation point;
Acquiring the final co-planar probability of each feed rough region according to the initial co-planar probability of the mutation point in each feed rough region;
combining the shape of each feed rough region with the position distribution between the edge pixel points on the edge and the final homofacial probability to obtain the main facial probability of each feed rough region;
Acquiring the edge responsivity of each pixel point in the feed gray level image by utilizing a hessian matrix method; adjusting the edge responsivity based on the main body surface probability of the rough feed region where the pixel points in the feed gray level image are positioned, and obtaining the edge of the feed main body in the feed gray level image;
and carrying out fine detection on the production quality of the feed according to the length of the feed edge in the feed gray level image.
Further, the method for dividing the feed gray level image into different feed rough areas comprises the following steps:
And acquiring a gradient image of the feed gray level image, dividing the gradient image by using a watershed algorithm, and taking corresponding areas of different obtained areas in the feed gray level image as rough feed areas.
Further, the method for dividing each rough feed region into a first sub-region and a second sub-region under each mutation point based on the position of each mutation point in each rough feed region comprises the following steps:
for each feed rough region, acquiring gradient values of all pixel points in the feed rough region, and taking the pixel point corresponding to the maximum gradient value as a mutation point of the feed rough region;
For each mutation point of the rough feed area, acquiring a target straight line passing through the mutation point, wherein the direction of the target straight line is the same as the preset direction; if the abrupt change point belongs to the edge pixel point on the edge of the feed rough area, taking a line segment between the edge pixel point corresponding to the intersection point of the target straight line and the edge of the feed rough area and the abrupt change point as a demarcation line segment corresponding to the abrupt change point; if the abrupt point does not belong to the edge pixel point on the edge of the feed rough area, taking a line segment between the target straight line and the edge pixel points corresponding to two intersection points of the edge of the feed rough area as a demarcation line segment corresponding to the abrupt point;
dividing the feed rough region into a first sub-region and a second sub-region under each mutation point based on the demarcation line segment of each mutation point in the feed rough region.
Further, the method for obtaining the initial homoplanar probability of each mutation point in each feed rough region according to the difference between the gray level distribution of the pixel points in the first sub-region and the second sub-region of each feed rough region under the same mutation point comprises the following steps:
Taking a first subarea and a second subarea of each feed rough area under each mutation point as analysis subareas;
For each analysis subarea, taking the average value of the gray values of all pixel points in the analysis subarea as the gray comprehensive value of the analysis subarea;
Taking the absolute value of the difference between the gray value of each pixel point in the analysis subarea and the gray comprehensive value as the gray discrete value of each pixel point in the analysis subarea; numbering the pixel points in the analysis subarea, and taking the absolute value of the difference value between the gray values of the pixel points of the next number of the corresponding number of each pixel point in the analysis subarea as the gray difference value of each pixel point in the analysis subarea;
acquiring a comprehensive change value of each pixel point in the analysis subarea according to the gray discrete value and the gray difference value; the gray discrete value and the gray difference value are in positive correlation with the comprehensive change value;
And combining the difference between the gray scale integrated values of the first subarea and the second subarea of each feed rough area under each mutation point and the difference between the integrated change values to obtain the initial coplanarity probability of each mutation point in each feed rough area.
Further, the calculation formula of the initial coplanarity probability of each mutation point in each feed rough region is as follows:
; in the/> The initial homoplanar probability of the jth mutation point in each feed rough region; /(I)The gray scale integrated value of the first subarea of each feed rough area under the j-th mutation point; /(I)The gray scale integrated value of the second subarea of each feed rough area under the j-th mutation point; m1 is the total number of pixel points in the first sub-area of each feed rough area under the j-th mutation point; /(I)The comprehensive change value of the (m 1) th pixel point in the first sub-area under the j-th mutation point in each feed rough area; /(I)The total number of pixel points in the second sub-area under the j-th mutation point in each feed rough area; The comprehensive change value of the m < 2 > pixel point in the second sub-area under the j-th mutation point in each feed rough area; /(I) As an absolute function; exp is an exponential function based on a natural constant e.
Further, the calculation formula of the final co-planar probability of each feed rough region is as follows:
; wherein W is the final homofacial probability of each feed rough region; j is the total number of mutation points in each coarse feed region; /(I) The initial homoplanar probability of the jth mutation point in each feed rough region; /(I)For the difference between the maximum value of the initial coplanarity probabilities of all the mutant points and the initial coplanarity probability of the jth mutant point in each coarse feed region; /(I)To be within each rough feed region, the difference between the initial co-planar likelihood of the jth mutation point and the minimum of the initial co-planar likelihoods of all mutation points.
Further, the method for obtaining the main surface likelihood of each feed rough region by combining the shape of each feed rough region and the position distribution between the edge pixel points on the edge and the final co-surface likelihood comprises the following steps:
For each feed rough region, selecting any one edge pixel point on the edge of the feed rough region as an analysis pixel point, calculating Euclidean distances between the analysis pixel point and each of the rest edge pixel points except the analysis pixel point on the edge of the feed rough region, and taking the smallest Euclidean distance as a length value of the analysis pixel point;
Taking the variance of the length value of the edge pixel point on the edge of the feed rough region as the rectangular feature degree of the feed rough region;
For each edge pixel point on the edge of the feed rough area, marking the pixel point in a preset neighborhood range of the edge pixel point to obtain a marking value of each pixel point in the preset neighborhood range of the edge pixel point; on the edge of the feed rough area, taking the next adjacent edge pixel point of each edge pixel point as a judging edge pixel point of each edge pixel point, and taking the label value of the judging edge pixel point in the preset neighborhood range of each edge pixel point as the relative position value of each edge pixel point on the edge of the feed rough area;
and combining the difference between the relative position values of the adjacent edge pixel points on the edge of each feed rough area, the rectangular feature degree and the final co-planar probability degree to obtain the main body co-planar probability degree of each feed rough area.
Further, the calculation formula of the main surface probability of each feed rough area is as follows:
; wherein R is the main body surface probability of each feed rough area; /(I) The rectangular feature degree of each feed rough area; b is the total number of edge pixel points on the edge of each feed rough area; /(I)The relative position value of the b-th edge pixel point on the edge of each feed rough area is calculated; /(I)The relative position value of the (b+1) th edge pixel point on the edge of each feed rough area; w is the final homotopy probability of each feed rough region; /(I)Is a normalization function; norm is the normalization function; exp is an exponential function based on a natural constant e.
Further, the method for obtaining the edge of the fodder main body in the fodder gray level image based on the main body surface probability of the fodder rough area where the pixel point in the fodder gray level image is located adjusts the edge responsiveness, includes:
Taking the product of the main surface probability of the rough feed region where each pixel point in the feed gray image is located and the edge responsivity as the edge adjustment responsivity of each pixel point in the feed gray image;
Taking the pixel point with the edge adjustment responsivity larger than a preset standard threshold value as a main body edge pixel point; and taking the connected domain formed by the pixel points at the edge of the main body as the edge of the main body of the feed.
Further, the method for finely detecting the production quality of the feed according to the length of the feed edge in the feed gray level image comprises the following steps:
Acquiring corner points on the edge of each feed main body in the feed gray level image; dividing the feed body edge into different sub-body edges for each feed body edge based on all corner points; counting the total number of pixel points on the edge of the sub-main body as the edge length of the edge of the sub-main body;
Taking the maximum value of the edge length of the sub-main body edge corresponding to the feed main body edge as the initial reference length of the feed main body edge; normalizing all the initial reference lengths to obtain the final reference length of the edge of the feed main body;
if the final reference length is within the preset standard length interval, the corresponding feed main body edge is taken as the feed standard main body edge;
Taking the ratio of the total number of the standard main body edges of the feed to the total number of the main body edges of the feed in the feed gray level image as a fine judgment value; when the fine judgment value is larger than or equal to a preset fine threshold value, the feed to be tested accords with the fine standard; and when the fine judgment value is smaller than a preset fine threshold value, the feed to be tested does not accord with the fine standard.
The invention has the following beneficial effects:
In the embodiment of the invention, in order to analyze the edge of the main body surface of the feed conveniently, a feed gray level image is divided into a feed rough region, abrupt points are pixel points with severe gray level change, and the feed rough region is divided into two sub-regions based on the abrupt points in order to analyze the possibility that the feed rough region is a mixed region of the main body surface and the side surface of the feed; because the gray scale expression of each feed is inconsistent with the gray scale expression of different surfaces of the same feed, the gray scale expression difference degree of different subregions corresponding to the mutation points reflects the possibility degree that the rough feed region is the same feed surface, the initial homonymy possibility degree is obtained according to the difference degree, the initial homonymy possibility degree of the mutation points in the rough feed region is further synthesized for analysis, and the accuracy of the follow-up judgment of the rough feed region is improved; the shape of the rough feed area and the position distribution between the edge pixel points on the edge of the rough feed area reflect the shape condition of the rough feed area in sequence from the whole and local aspects, and the final homofacial probability analysis of the homofacial condition is combined, so that the rough feed area presents a main body surface state, namely the main body surface probability is more convincing; when the feed main body edge in the feed gray level image is obtained by utilizing the Hessen matrix method, the side edge of the feed is identified, so that the influence of the side edge of the feed on the obtained feed main body edge is reduced, the edge responsivity of the pixel point is adaptively adjusted by utilizing the main body surface possibility degree reflecting the main body surface condition of the feed, the distinguishing degree between the edge of the main body surface of the feed and the edge of the side surface is improved, the possibility that the side edge of the feed is mistakenly identified is reduced, the credibility of identifying the main body edge of the feed is increased, and the accuracy of the fine detection of the feed production quality is improved.
Drawings
In order to more clearly illustrate the embodiments of the invention or the technical solutions and advantages of the prior art, the following description will briefly explain the drawings used in the embodiments or the description of the prior art, and it is obvious that the drawings in the following description are only some embodiments of the invention, and other drawings can be obtained according to the drawings without inventive effort for a person skilled in the art.
FIG. 1 is a flow chart of a method for detecting the quality of feed production based on image feature analysis according to an embodiment of the present invention;
Fig. 2 is a schematic diagram of a gray scale image of a feed to be tested according to an embodiment of the present invention.
Detailed Description
In order to further explain the technical means and effects adopted by the invention to achieve the preset aim, the following is a detailed detection method for the production quality of the feed based on image feature analysis, which is provided by the invention, with reference to the accompanying drawings and the preferred embodiment, and the detailed implementation, structure, features and effects thereof are as follows. In the following description, different "one embodiment" or "another embodiment" means that the embodiments are not necessarily the same. Furthermore, the particular features, structures, or characteristics of one or more embodiments may be combined in any suitable manner.
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs.
The invention aims at the specific scene: aiming at scattered and overlapped feed, the side surface of the feed can interfere with the main body surface during fine analysis and detection, and the side surface is mistakenly considered as a incomplete part of the main body surface of the feed, so that the final fine detection result of the feed is inaccurate.
The invention provides a specific scheme of a feed production quality refined detection method based on image feature analysis, which is specifically described below with reference to the accompanying drawings.
Referring to fig. 1, a flowchart of a method for detecting quality refinement of feed production based on image feature analysis according to an embodiment of the present invention is shown, where the method includes:
step S1: and acquiring a feed gray image of the feed to be tested.
Specifically, the feeds to be detected which need to be subjected to fine detection are scattered and stacked, a light source is in the horizontal right direction, and a high-definition camera is used for shooting the feeds to be detected, so that an original image of the feeds is obtained; graying treatment is carried out on the feed original image to obtain a feed gray image; fig. 2 is a schematic diagram of a gray scale image of a feed to be tested according to an embodiment of the present invention.
It should be noted that, in the embodiment of the present invention, the weighted average graying algorithm is selected to perform graying processing, and a specific method is not described herein, which is a technical means well known to those skilled in the art. Other image capturing devices and image preprocessing algorithms, which are well known to those skilled in the art, may be used in other embodiments of the present invention, and are not limited herein.
Step S2: dividing the feed gray level image into different feed rough areas; dividing each feed rough region into a first sub-region and a second sub-region under each mutation point based on the position of each mutation point in each feed rough region; and acquiring the initial homoplanar probability of each mutation point in each feed rough region according to the difference between the gray level distribution of the pixel points in the first sub-region and the second sub-region of each feed rough region under the same mutation point.
Because the feeds are randomly stacked and placed, the feeds are not completely and tightly distributed, a certain gap exists between the feeds, and the gap is expressed as low gray in the gray image of the feeds. Meanwhile, the gray scale of each feed under the fixed light source is inconsistent; the angle between the main surface and the side surface of the same feed is different, so that the gray scale performance of different surfaces of the same feed is also inconsistent. The gradient value of the boundary position of different gray scale expressions of different feeds and the boundary position of different surfaces of the same feed is larger, and the gray scale image of the feed is roughly divided to obtain a rough feed area for the convenience of analysis.
Preferably, the specific acquisition method of the rough feed area comprises the following steps: and (3) acquiring a gradient image of the feed gray level image, dividing the gradient image by using a watershed algorithm, and taking the corresponding areas of the different obtained areas in the feed gray level image as rough feed areas.
Because the difference of gray scale expressions of different surfaces of the same feed is caused by illumination, compared with the gradient value of the pixel point at the boundary position of different gray scale expressions of different feeds, the gradient value of the pixel point at the boundary position of different surfaces of the same feed is smaller, so that the rough feed area can be the main body surface of the feed, the side surface of the feed and the mixed area of the main body surface and the side surface of the feed.
In the embodiment of the invention, the gradient value of each pixel point in the feed gray image is obtained by using the Canny operator, and the gradient value of the pixel point in the feed gray image forms the gradient image of the feed gray image. The region obtained by dividing the gradient image by using the watershed algorithm is a rough region of the feed in the gradient image, and the gray level image of the feed corresponds to the rough region of the feed in the gradient image one by one. The watershed algorithm and the Canny operator are well known to those skilled in the art, and are not described herein.
In order to analyze the possibility that the rough feed area is a mixed area of a main body surface and a side surface of the feed, dividing the rough feed area based on pixel points with larger gradient values to obtain sub-areas.
Preferably, the dividing method of the feed rough region under each mutation point comprises the following steps: for each feed rough region, gradient values of all pixel points in the feed rough region are obtained, and the pixel point corresponding to the maximum gradient value is used as a mutation point of the feed rough region; for each mutation point of the rough feed area, acquiring a target straight line passing through the mutation point, wherein the direction of the target straight line is the same as the preset direction; if the abrupt change point belongs to the edge pixel point on the edge of the feed rough area, taking a line segment between the edge pixel point corresponding to the intersection point of the target straight line and the edge of the feed rough area and the abrupt change point as a demarcation line segment corresponding to the abrupt change point; if the abrupt point does not belong to the edge pixel point on the edge of the feed rough area, taking a line segment between the target straight line and the edge pixel points corresponding to two intersection points of the edge of the feed rough area as a demarcation line segment corresponding to the abrupt point; dividing the feed rough region into a first sub-region and a second sub-region under each mutation point based on the demarcation line segment of each mutation point in the feed rough region.
Because the gradient value of the pixel points at the boundary position of different gray scale expressions of different feeds and the boundary position of different surfaces of the same feed is larger, the purpose of dividing the rough feed area based on the pixel point with the largest gradient value in the rough feed area, namely the abrupt point, is to accurately divide the rough feed area containing the main body surface and the side surface of the feed into the main body surface and the side surface as much as possible. There may be a plurality of pixels with the largest gradient value in the rough feed region, and at least one mutation point exists in the rough feed region.
It should be noted that, in the embodiment of the present invention, the direction of the light source is horizontal to the right, so that the main surface and the side surface of the same feed are generally distributed left and right, and the preset direction is set to be perpendicular to the direction of the light source. The edge of the feed rough region is a communication region formed by pixel points at the outermost side of the feed rough region; if the edge of the rough feed area is a closed curve, if the abrupt point does not belong to the edge pixel point on the edge of the rough feed area, the target line segment corresponding to the abrupt point and the edge of the rough feed area only have two intersection points because the main body surface and the side surface of the feed are in regular shapes. If the target straight line corresponding to the mutation point has no intersection point with the edge of the rough feed region, the mutation point does not need to be subjected to the step and subsequent analysis.
Because the gray scale expression of each feed is inconsistent, the gray scale expression of different surfaces of the same feed is also inconsistent, and the gray scale expression difference degree of different subareas corresponding to the mutation points reflects the possibility degree that the rough feed area is the same feed surface, so that the initial same-surface possibility degree is obtained.
Preferably, the specific acquisition method of the initial co-planar probability degree is as follows: taking a first subarea and a second subarea of each feed rough area under each mutation point as analysis subareas; for each analysis subarea, taking the average value of the gray values of all pixel points in the analysis subarea as the gray comprehensive value of the analysis subarea; taking the absolute value of the difference between the gray value and the gray comprehensive value of each pixel point in the analysis subarea as the gray discrete value of each pixel point in the analysis subarea; numbering the pixel points in the analysis subarea, and taking the absolute value of the difference value between the gray values of the pixel points of the next number of the corresponding number of each pixel point in the analysis subarea as the gray difference value of each pixel point in the analysis subarea; acquiring a comprehensive change value of each pixel point in the analysis subarea according to the gray discrete value and the gray difference value; the gray discrete value and the gray difference value are in positive correlation with the comprehensive change value; and combining the difference between the gray scale integrated values of the first subarea and the second subarea of each feed rough area under each mutation point and the difference between the integrated change values to obtain the initial homotopy probability of each mutation point in each feed rough area.
In the embodiment of the invention, the number of the pixels in the analysis subarea is numbered from the pixel in the upper left corner of the analysis subarea, that is, the number of the first pixel in the first row, which is the upper left corner of the analysis subarea, is 1, and the number of the pixels in the first row is sequentially increased from left to right; the second row starts from the last pixel point, and the sequence numbers of the pixel points in the second row are sequentially increased from right to left; if the serial number of the last pixel point in the first row is 10, the serial number of the last pixel point in the second row is 11, the serial number of the last pixel point in the second row is 12, and so on; the third row starts from the first pixel point, the serial numbers of the pixel points in the third row are sequentially increased from left to right, and if the serial number of the first pixel point in the second row is 22, the serial number of the first pixel point in the third row is 23; and so on, numbering each pixel point in the analysis sub-area. The number of the pixel points in the analysis subarea corresponds to the number of the pixel points one by one, for example, the 10 th pixel point in the analysis subarea is the pixel point with the number of 10.
It should be noted that the pixels in the analysis sub-region do not include the pixels on the corresponding target line segment.
The calculation formula of the comprehensive change value of each pixel point in the analysis subarea is as follows:
Wherein ZB is the comprehensive change value of each pixel point in the analysis subarea; the gray value of the mth pixel point in the sub-area is analyzed; /(I) The gray value of the (m+1) th pixel point in the sub-area is analyzed; /(I)To analyze the gray scale integrated value of the subarea; /(I)The gray level difference value of the mth pixel point in the sub-area is analyzed; /(I)The gray level discrete value of the mth pixel point in the analysis subarea is obtained; /(I)As a function of absolute value.
In order to accurately study the gray scale conditions of the pixel points in the analysis subarea, the gray scale difference of the adjacent pixel points and the gray scale value mean value of the pixel points in the analysis subarea in the embodiment of the inventionAnalysis was performed in two ways; when the difference between the gray value of each pixel point and the gray value of the next pixel point is/>The larger, and the difference between each pixel point and the gray average value in the region is/>The larger the gray scale variation of the pixel points in the analysis sub-area is, the larger the comprehensive variation value ZB is.
Combining the difference between the gray scale integrated values of the first subarea and the second subarea of each feed rough area under each mutation point and the difference between the integrated change values to obtain the initial homonymy probability of each mutation point in each feed rough area; the calculation formula of the initial co-planar probability is as follows:
in the method, in the process of the invention, The initial homoplanar probability of the jth mutation point in each feed rough region; /(I)The gray scale integrated value of the first subarea of each feed rough area under the j-th mutation point; /(I)The gray scale integrated value of the second subarea of each feed rough area under the j-th mutation point; m1 is the total number of pixel points in the first sub-area of each feed rough area under the j-th mutation point; /(I)The comprehensive change value of the (m 1) th pixel point in the first sub-area under the j-th mutation point in each feed rough area is calculated; /(I)The total number of pixel points in the second sub-area under the j-th mutation point in each feed rough area; /(I)The comprehensive change value of the m < 2 > pixel point in the second sub-area under the j-th mutation point in each feed rough area is calculated; /(I)As an absolute function; exp is an exponential function based on a natural constant e.
Is a reference index for judging the difference between the gray level distribution of the pixel points in the first sub-area and the second sub-area corresponding to the jth mutation point in the rough feed area, and when/>The smaller the j-th mutation point is, the higher the gray level consistency of the first subarea and the second subarea is, which means that the greater the possibility that the two subareas are the main surface or the side surface of the same feed, the initial homotopy probability/>The larger; when/>The larger the gray scale difference between the first subarea and the second subarea corresponding to the jth mutation point is, the larger the gray scale difference between the first subarea and the second subarea is, which shows that one of the two subareas is a main feed surface, the larger the possibility that the other subarea is a side feed surface is, the initial homotopy possibility/>The smaller.
Presenting the gray level difference degree of the pixel points in the first sub-area and the second sub-area corresponding to the jth mutation point in the feed rough area, and when/>The smaller the gray value of the whole pixel point in the two sub-areas is, the more similar the gray value of the whole pixel point in the two sub-areas is, the greater the possibility that the two sub-areas are the main surface or the side surface of the same feed is, and the initial homotopy probability/>The larger. /(I)Gray level difference degree of first subarea and second subarea corresponding to overall showing abrupt point,/>The gray level difference degree of the first subarea and the second subarea corresponding to the mutation point is presented from the gray level condition of the local single pixel point, and the two subareas are comprehensively analyzed, so that the initial homography degree is more reliable.
Step S3: and acquiring the final co-planar probability of each feed rough region according to the initial co-planar probability of the mutation point in each feed rough region.
The initial homotopy probability degree shows that two sub-areas corresponding to each mutation point in the feed rough area are homotopy degrees of the feed, and analysis is performed by integrating the initial homotopy probability degree of all the mutation points in the feed rough area, so that the accuracy of the follow-up judgment of the feed rough area is improved.
The initial homoplanar probability of the mutation points in each feed rough region is obtained; the final homography probability is calculated as follows:
Wherein W is the final homofacial probability of each feed rough region; j is the total number of mutation points in each coarse feed region; The initial homoplanar probability of the jth mutation point in each feed rough region; /(I) For the difference between the maximum value of the initial coplanarity probabilities of all the mutant points and the initial coplanarity probability of the jth mutant point in each coarse feed region; /(I)To be within each rough feed region, the difference between the initial co-planar likelihood of the jth mutation point and the minimum of the initial co-planar likelihoods of all mutation points.
The dynamic adjustment of the initial co-planar probability is realized, the initial co-planar probability with larger numerical value is adjusted to be larger, and the initial co-planar probability with smaller numerical value is adjusted to be smaller; the greater the final co-planar likelihood W, the greater the degree of accuracy with which the rough feed region is the major feed face or side of the feed.
The specific analysis is as follows: initial in-plane likelihood of the jth mutation point in the coarse fodder regionThe smaller the first aspect is/>The larger is, make/>Approaching 0; second aspect/>The smaller is, make/>Get close to 1, leading to/>Approaching 0. Initial in-plane likelihood/>, when the jth mutation point in the rough feed regionThe larger the first aspect is/>The smaller is, make/>Approaching 1; second aspect/>The larger is so thatGet close to 0, leading to/>Approaching 2.
Step S4: and combining the shape of each feed rough region with the position distribution between the edge pixel points on the edge and the final co-planar probability, and acquiring the main body planar probability of each feed rough region.
Specifically, the main surface of the feed is approximately rectangular, and the side surface is approximately circular; the shape of the region reflects the shape of the rough region of the feed from the whole, the position distribution among the edge pixel points on the edge of the rough region of the feed reflects the shape of the rough region of the feed from the part, the two are analyzed comprehensively, and the final co-planar probability of presenting the co-planar condition of the rough region of the feed is simultaneously analyzed, so that the rough region of the feed presents the state of the main body surface, namely the main body surface probability is more convincing.
Preferably, the method for obtaining the likelihood of the main body surface is as follows: for each feed rough region, selecting any one edge pixel point on the edge of the feed rough region as an analysis pixel point, calculating Euclidean distances between the analysis pixel point and each of the rest edge pixel points except the analysis pixel point on the edge of the feed rough region, and taking the minimum Euclidean distance as a length value of the analysis pixel point; taking the variance of the length value of the edge pixel point on the edge of the feed rough region as the rectangular feature degree of the feed rough region; for each edge pixel point on the edge of the feed rough area, marking the pixel point in a preset neighborhood range of the edge pixel point to obtain a marking value of each pixel point in the preset neighborhood range of the edge pixel point; on the edge of the feed rough area, taking the next adjacent edge pixel point of each edge pixel point as a judging edge pixel point of each edge pixel point, and taking the label value of the judging edge pixel point in the preset neighborhood range of each edge pixel point as the relative position value of each edge pixel point on the edge of the feed rough area; and combining the difference between the relative position values of the adjacent edge pixel points on the edge of each feed rough region, the rectangular feature degree and the final co-planar probability degree to obtain the main body co-planar probability degree of each feed rough region.
It should be noted that, any one of the edge pixel points on the edge of the feed rough area is selected as the first edge pixel point on the edge of the feed rough area, and the number sequence of the edge pixel points on the edge of the feed rough area is sequentially increased along the anticlockwise direction. In the embodiment of the invention, the preset neighborhood is eight neighborhood, for the pixel points in the eight neighborhood range of each edge pixel point on the edge of the feed rough area, the label value of the pixel point in the horizontal right direction of the edge pixel point is 0, the pixel points rotate once along the anticlockwise direction, the label values of the pixel points in the eight neighborhood of the edge pixel point are 0, 1,2, 3, 4, 5, 6 and 7 in sequence, the label value of the pixel point in the upper right direction of the edge pixel point is 1, and the label value of the pixel point in the upper right direction of the edge pixel point is 2, and the like. On the edge of the feed rough region, if the (b+1) th edge pixel point, namely the judging edge pixel point of the (b) th edge pixel point is in the right upper direction in the eight neighborhood range of the (b) th edge pixel point, the relative position value of the (b) th edge pixel point is equal to 2; and acquiring the relative position value of each edge pixel point on the edge of the rough feed area according to the method. Because the edge of the feed rough area is a closed edge, each pixel point on the edge of the feed rough area has a corresponding relative position value. It should be noted that the last edge pixel point on the edge of the rough feed area is the first edge pixel point.
The calculation formula of the main surface probability of each feed rough area is as follows:
Wherein R is the main body surface probability of each feed rough area; rectangular feature degree of each feed rough area; b is the total number of edge pixel points on the edge of each feed rough area; /(I) The relative position value of the b-th edge pixel point on the edge of each feed rough area is calculated; /(I)The relative position value of the (b+1) th edge pixel point on the edge of each feed rough area; w is the final homotopy probability of each feed rough region; /(I)Is a normalization function; norm is the normalization function; exp is an exponential function based on a natural constant e.
The main surface of the feed is approximately rectangular, and the side surface is approximately round; in the embodiment of the invention, the rectangular feature degree is adoptedThe shape characteristics of the rough area of the feed were measured. When/>When the feed is smaller, the length values of all edge pixel points on the edge of the feed rough area are closer, the feed rough area accords with the circular characteristic of the feed side surface but does not accord with the rectangular characteristic of the main body surface, and the main body surface probability R is smaller; when/>When the feed is bigger, the length value of the edge pixel point on the edge of the feed rough area is more discrete, the feed rough area is more consistent with the rectangular characteristic of the main body surface of the feed side, and the main body surface probability R is greater.
The difference between the main body surface and the side surface is reflected by the difference between the relative position values of the edge pixel points on the edge of the feed rough area. The main body surface of the feed is approximately rectangular, and the relative position values of the edge pixel points on each side edge of the main body surface are basically equal; the more likely that the relative position values of adjacent edge pixels on the edges of the side will differ is the side being approximately circular. When/>When the feed rough area is larger, the round characteristics of the feed side are more consistent, the probability that the feed rough area is the main body surface of the feed is smaller, and the probability of the main body surface is smaller; when/>The smaller the feed rough area is, the more likely the feed rough area is to be the feed body surface, and the greater the body surface likelihood R is.
When W is larger, the probability that the rough feed area is the same main body surface and side surface of the feed is larger; at the same time considerWhen/>The greater W is the same time, the greater the likelihood that the rough feed area is the feed body surface, the greater the body surface likelihood R.
Step S5: acquiring the edge responsivity of each pixel point in the feed gray level image by utilizing a hessian matrix method; and adjusting edge responsivity based on the main body surface probability of the rough feed region where the pixel points in the feed gray level image are positioned, and obtaining the edge of the feed main body in the feed gray level image.
When the feed main body edge in the feed gray level image is acquired by utilizing the Hessen matrix method, the side edge of the feed is identified, and in order to reduce the influence of the side edge of the feed on the acquisition of the feed main body edge, the edge responsivity of the pixel points is adjusted by utilizing the main body surface probability reflecting the condition of the main body surface of the feed, the distinguishing degree between the edge of the main body surface of the feed and the edge of the side surface is improved, and the possibility of misrecognition of the side edge of the feed is reduced.
And obtaining the edge responsivity of each pixel point in the feed gray level image by utilizing a hessian matrix method. The specific process is as follows: calculating the gradient value of each pixel point in the feed gray level image in the x direction and the gradient value in the y direction; performing second derivative operation on the gradient values to obtain a hessian matrix of each pixel point in the feed gray level image; and acquiring the edge responsivity of each pixel point according to the hessian matrix. The calculation formula of the edge responsivity of each pixel point in the feed gray level image is as follows:
wherein R is the edge responsivity of each pixel point in the feed gray level image; the method comprises the steps of (1) setting a first eigenvalue of a hessian matrix for each pixel point in a feed gray level image; /(I) The second eigenvalue of the hessian matrix of each pixel point in the feed gray level image; k is a preset constant, usually takes a smaller value, and in the embodiment of the invention, takes an empirical value of 1, and can be set by an implementer according to specific situations.
It should be noted that, the method for obtaining the hessian matrix of the pixel point in the feed gray level image and the characteristic value thereof by using the hessian matrix method and the method for obtaining the edge responsivity of the pixel point by using the hessian matrix method are all known techniques, and the specific obtaining process is not described herein.
And taking the product of the main body surface probability and the edge responsivity of the rough feed region where each pixel point in the feed gray level image is positioned as the edge adjustment responsivity of each pixel point in the feed gray level image. The edge responsivity of the pixel points at the edge positions is larger, the edge responsivity of the pixel points in the surface of the feed is lower, the main surface probability of the pixel points in the surface of the feed is utilized to adjust the edge responsivity, the change is smaller, and the recognition of the pixel points at the edge positions is not influenced.
Taking the pixel point with the edge adjustment responsivity larger than a preset standard threshold value as a main body edge pixel point; and taking the connected domain formed by the pixel points at the edge of the main body as the edge of the main body of the feed.
In the embodiment of the invention, the preset standard threshold takes the empirical value of 0.6, and an implementer can set the standard threshold according to specific conditions.
Step S6: and carrying out fine detection on the production quality of the feed according to the length of the feed edge in the feed gray level image.
Acquiring corner points on the edge of each feed main body in the feed gray level image; dividing the feed body edge into different sub-body edges for each feed body edge based on all corner points; counting the total number of pixel points on the edge of the sub-main body as the edge length of the edge of the sub-main body; taking the maximum value of the edge length of the sub-main body edge corresponding to the edge of the feed main body as the initial reference length of the edge of the feed main body; and normalizing all the initial reference lengths to obtain the final reference length of the edge of the feed main body.
In the embodiment of the invention, the long side of the main feed surface is used for judging the feed specification because the main feed surface is approximately rectangular, and the length of the long side of the main feed surface is equal to the initial reference length. Since the feed body edge is a closed edge, the number of corner points on the feed body edge is equal to the number of sub-body edges into which the feed body edge is divided.
In the embodiment of the invention, harris corner detection is selected to obtain the corner point on the edge of the feed main body; and carrying out normalization processing on the initial reference length by using a maximum and minimum normalization method, wherein the Harris corner detection and the maximum and minimum normalization method are known techniques and are not repeated here.
If the final reference length is within the preset standard length interval, the corresponding feed main body edge is taken as the feed standard main body edge; taking the ratio of the total number of the standard main body edges of the feed to the total number of the main body edges of the feed in the feed gray level image as a fine judgment value; when the fine judgment value is larger than or equal to a preset fine threshold value, the feed to be tested accords with the fine standard; when the fine judgment value is smaller than a preset fine threshold value, the feed to be tested does not accord with the fine standard.
In the embodiment of the invention, the preset standard length interval takes the empirical valueThe preset fine threshold takes an experience value of 0.8, and an implementer can set the fine threshold according to specific conditions.
The present invention has been completed.
In summary, in the embodiment of the invention, according to the gray distribution difference between the first sub-region and the second sub-region under the abrupt change points in the rough feed region in the gray image of the feed, the initial coplanarity probability of the abrupt change points in the rough feed region is obtained, the initial coplanarity probability of all the abrupt change points in the same rough feed region is synthesized, the final coplanarity probability of the rough feed region is obtained, the shape of the rough feed region and the position distribution analysis between the edge pixels on the edge are combined, the edge responsivity of the pixels is adjusted by using the obtained main surface probability, the edge of the main feed is obtained, and the fine detection of the production quality of the feed is performed based on the length of the main surface probability. The invention reduces the edge responsivity of the side edge of the feed by utilizing the possibility of the main body surface, reduces the possibility of false identification of the edge of the feed main body, and improves the accuracy of refined detection of the feed production quality.
It should be noted that: the sequence of the embodiments of the present invention is only for description, and does not represent the advantages and disadvantages of the embodiments. The processes depicted in the accompanying drawings do not necessarily require the particular order shown, or sequential order, to achieve desirable results. In some embodiments, multitasking and parallel processing are also possible or may be advantageous.
In this specification, each embodiment is described in a progressive manner, and identical and similar parts of each embodiment are all referred to each other, and each embodiment mainly describes differences from other embodiments.

Claims (8)

1. A method for detecting the production quality of feed based on image feature analysis is characterized by comprising the following steps:
Acquiring a feed gray image of a feed to be tested;
Dividing the feed gray level image into different feed rough areas; dividing each feed rough region into a first sub-region and a second sub-region under each mutation point based on the position of each mutation point in each feed rough region; acquiring the initial homoplanar probability of each mutation point in each feed rough region according to the difference between the gray level distribution of the pixel points in the first sub-region and the second sub-region of each feed rough region under the same mutation point;
Acquiring the final co-planar probability of each feed rough region according to the initial co-planar probability of the mutation point in each feed rough region;
combining the shape of each feed rough region with the position distribution between the edge pixel points on the edge and the final homofacial probability to obtain the main facial probability of each feed rough region;
Acquiring the edge responsivity of each pixel point in the feed gray level image by utilizing a hessian matrix method; adjusting the edge responsivity based on the main body surface probability of the rough feed region where the pixel points in the feed gray level image are positioned, and obtaining the edge of the feed main body in the feed gray level image;
carrying out fine detection on the production quality of the feed according to the length of the feed edge in the feed gray level image;
for each feed rough region, acquiring gradient values of all pixel points in the feed rough region, and taking the pixel point corresponding to the maximum gradient value as a mutation point of the feed rough region;
The calculation formula of the final coplanarity probability of each feed rough region is as follows:
; wherein W is the final homofacial probability of each feed rough region; j is the total number of mutation points in each coarse feed region; /(I) The initial homoplanar probability of the jth mutation point in each feed rough region; /(I)For the difference between the maximum value of the initial coplanarity probabilities of all the mutant points and the initial coplanarity probability of the jth mutant point in each coarse feed region; /(I)To be in each feed rough region, the difference between the initial co-planar probability of the jth mutation point and the minimum value in the initial co-planar probability of all mutation points;
The method for obtaining the edge of the feed main body in the feed gray level image based on the main body surface probability of the rough feed region where the pixel points in the feed gray level image are located adjusts the edge responsiveness, and comprises the following steps:
Taking the product of the main surface probability of the rough feed region where each pixel point in the feed gray image is located and the edge responsivity as the edge adjustment responsivity of each pixel point in the feed gray image;
Taking the pixel point with the edge adjustment responsivity larger than a preset standard threshold value as a main body edge pixel point; and taking the connected domain formed by the pixel points at the edge of the main body as the edge of the main body of the feed.
2. The method for detecting the fine quality of feed production based on image feature analysis according to claim 1, wherein the method for dividing the feed gray level image into different feed rough areas comprises the following steps:
And acquiring a gradient image of the feed gray level image, dividing the gradient image by using a watershed algorithm, and taking corresponding areas of different obtained areas in the feed gray level image as rough feed areas.
3. The method for detecting the quality of feed production refinement based on image feature analysis according to claim 1, wherein the method for dividing each feed rough region into a first subregion and a second subregion under each mutation point based on the position of each mutation point in each feed rough region comprises the following steps:
for each rough feed area, for each mutation point of the rough feed area, acquiring a target straight line passing through the mutation point, wherein the direction of the target straight line is the same as the preset direction; if the abrupt change point belongs to the edge pixel point on the edge of the feed rough area, taking a line segment between the edge pixel point corresponding to the intersection point of the target straight line and the edge of the feed rough area and the abrupt change point as a demarcation line segment corresponding to the abrupt change point; if the abrupt point does not belong to the edge pixel point on the edge of the feed rough area, taking a line segment between the target straight line and the edge pixel points corresponding to two intersection points of the edge of the feed rough area as a demarcation line segment corresponding to the abrupt point;
dividing the feed rough region into a first sub-region and a second sub-region under each mutation point based on the demarcation line segment of each mutation point in the feed rough region.
4. The method for detecting the quality refinement of feed production based on image feature analysis according to claim 1, wherein the method for acquiring the initial homography possibility of each mutation point in each feed rough region according to the difference between gray level distribution of the pixel points in the first sub region and the second sub region of each feed rough region under the same mutation point comprises the following steps:
Taking a first subarea and a second subarea of each feed rough area under each mutation point as analysis subareas;
For each analysis subarea, taking the average value of the gray values of all pixel points in the analysis subarea as the gray comprehensive value of the analysis subarea;
Taking the absolute value of the difference between the gray value of each pixel point in the analysis subarea and the gray comprehensive value as the gray discrete value of each pixel point in the analysis subarea; numbering the pixel points in the analysis subarea, and taking the absolute value of the difference value between the gray values of the pixel points of the next number of the corresponding number of each pixel point in the analysis subarea as the gray difference value of each pixel point in the analysis subarea;
acquiring a comprehensive change value of each pixel point in the analysis subarea according to the gray discrete value and the gray difference value; the gray discrete value and the gray difference value are in positive correlation with the comprehensive change value;
And combining the difference between the gray scale integrated values of the first subarea and the second subarea of each feed rough area under each mutation point and the difference between the integrated change values to obtain the initial coplanarity probability of each mutation point in each feed rough area.
5. The method for detecting the quality of feed production based on image feature analysis according to claim 4, wherein the calculation formula of the initial co-planar probability of each mutation point in each feed rough region is as follows:
; in the/> The initial homoplanar probability of the jth mutation point in each feed rough region; /(I)The gray scale integrated value of the first subarea of each feed rough area under the j-th mutation point; /(I)The gray scale integrated value of the second subarea of each feed rough area under the j-th mutation point; m1 is the total number of pixel points in the first sub-area of each feed rough area under the j-th mutation point; The comprehensive change value of the (m 1) th pixel point in the first sub-area under the j-th mutation point in each feed rough area; /(I) The total number of pixel points in the second sub-area under the j-th mutation point in each feed rough area; The comprehensive change value of the m < 2 > pixel point in the second sub-area under the j-th mutation point in each feed rough area; /(I) As an absolute function; exp is an exponential function based on a natural constant e.
6. The method for detecting the quality of feed production refinement based on image feature analysis according to claim 1, wherein the method for acquiring the main surface likelihood of each feed rough region by combining the shape of each feed rough region and the position distribution between edge pixels on the edge and the final co-planar likelihood comprises the following steps:
For each feed rough region, selecting any one edge pixel point on the edge of the feed rough region as an analysis pixel point, calculating Euclidean distances between the analysis pixel point and each of the rest edge pixel points except the analysis pixel point on the edge of the feed rough region, and taking the smallest Euclidean distance as a length value of the analysis pixel point;
Taking the variance of the length value of the edge pixel point on the edge of the feed rough region as the rectangular feature degree of the feed rough region;
For each edge pixel point on the edge of the feed rough area, marking the pixel point in a preset neighborhood range of the edge pixel point to obtain a marking value of each pixel point in the preset neighborhood range of the edge pixel point; on the edge of the feed rough area, taking the next adjacent edge pixel point of each edge pixel point as a judging edge pixel point of each edge pixel point, and taking the label value of the judging edge pixel point in the preset neighborhood range of each edge pixel point as the relative position value of each edge pixel point on the edge of the feed rough area;
and combining the difference between the relative position values of the adjacent edge pixel points on the edge of each feed rough area, the rectangular feature degree and the final co-planar probability degree to obtain the main body co-planar probability degree of each feed rough area.
7. The method for detecting the quality of feed production refinement based on image feature analysis according to claim 6, wherein the calculation formula of the main surface probability of each feed rough region is as follows:
; wherein R is the main body surface probability of each feed rough area; /(I) The rectangular feature degree of each feed rough area; b is the total number of edge pixel points on the edge of each feed rough area; /(I)The relative position value of the b-th edge pixel point on the edge of each feed rough area is calculated; /(I)The relative position value of the (b+1) th edge pixel point on the edge of each feed rough area; w is the final homotopy probability of each feed rough region; /(I)Is a normalization function; norm is the normalization function; exp is an exponential function based on a natural constant e.
8. The method for detecting the quality of feed production based on image feature analysis according to claim 1, wherein the method for detecting the quality of feed production based on the length of the feed edge in the feed gray level image comprises the following steps:
Acquiring corner points on the edge of each feed main body in the feed gray level image; dividing the feed body edge into different sub-body edges for each feed body edge based on all corner points; counting the total number of pixel points on the edge of the sub-main body as the edge length of the edge of the sub-main body;
Taking the maximum value of the edge length of the sub-main body edge corresponding to the feed main body edge as the initial reference length of the feed main body edge; normalizing all the initial reference lengths to obtain the final reference length of the edge of the feed main body;
if the final reference length is within the preset standard length interval, the corresponding feed main body edge is taken as the feed standard main body edge;
Taking the ratio of the total number of the standard main body edges of the feed to the total number of the main body edges of the feed in the feed gray level image as a fine judgment value; when the fine judgment value is larger than or equal to a preset fine threshold value, the feed to be tested accords with the fine standard; and when the fine judgment value is smaller than a preset fine threshold value, the feed to be tested does not accord with the fine standard.
CN202410224600.XA 2024-02-29 2024-02-29 Feed production quality refinement detection method based on image feature analysis Active CN117808806B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202410224600.XA CN117808806B (en) 2024-02-29 2024-02-29 Feed production quality refinement detection method based on image feature analysis

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202410224600.XA CN117808806B (en) 2024-02-29 2024-02-29 Feed production quality refinement detection method based on image feature analysis

Publications (2)

Publication Number Publication Date
CN117808806A CN117808806A (en) 2024-04-02
CN117808806B true CN117808806B (en) 2024-05-03

Family

ID=90428017

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202410224600.XA Active CN117808806B (en) 2024-02-29 2024-02-29 Feed production quality refinement detection method based on image feature analysis

Country Status (1)

Country Link
CN (1) CN117808806B (en)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20070071733A (en) * 2005-12-30 2007-07-04 한이진 A quality monitoring system of butchered chicken line and method thereof
CN103202407A (en) * 2013-05-03 2013-07-17 新疆泰昆集团股份有限公司 Puffed aquatic feed and preparation method thereof
RU2013104418A (en) * 2013-02-01 2014-08-10 Федеральное государственное бюджетное образовательное учреждение высшего профессионального образования "Поволжский государственный технологический университет" METHOD FOR ANALYSIS OF THE SPECIAL COMPOSITION OF MEADOW GRASS ON DYNAMICS OF MASS OF SAMPLE PARTS
CN106370667A (en) * 2016-07-28 2017-02-01 广东技术师范学院 Visual detection apparatus and method for quality of corn kernel
CN114581376A (en) * 2022-01-31 2022-06-03 南通摩瑞纺织有限公司 Automatic sorting method and system for textile silkworm cocoons based on image recognition
CN116448686A (en) * 2023-06-15 2023-07-18 广东省农业科学院动物科学研究所 Anti-counterfeiting detection method for fragile fish feed
CN116797598A (en) * 2023-08-22 2023-09-22 山东万牧农业科技有限公司郯城分公司 Image feature-based cultivation feed quality refinement detection method
CN116934740A (en) * 2023-09-11 2023-10-24 深圳市伟利达精密塑胶模具有限公司 Plastic mold surface defect analysis and detection method based on image processing

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20070071733A (en) * 2005-12-30 2007-07-04 한이진 A quality monitoring system of butchered chicken line and method thereof
RU2013104418A (en) * 2013-02-01 2014-08-10 Федеральное государственное бюджетное образовательное учреждение высшего профессионального образования "Поволжский государственный технологический университет" METHOD FOR ANALYSIS OF THE SPECIAL COMPOSITION OF MEADOW GRASS ON DYNAMICS OF MASS OF SAMPLE PARTS
CN103202407A (en) * 2013-05-03 2013-07-17 新疆泰昆集团股份有限公司 Puffed aquatic feed and preparation method thereof
CN106370667A (en) * 2016-07-28 2017-02-01 广东技术师范学院 Visual detection apparatus and method for quality of corn kernel
CN114581376A (en) * 2022-01-31 2022-06-03 南通摩瑞纺织有限公司 Automatic sorting method and system for textile silkworm cocoons based on image recognition
CN116448686A (en) * 2023-06-15 2023-07-18 广东省农业科学院动物科学研究所 Anti-counterfeiting detection method for fragile fish feed
CN116797598A (en) * 2023-08-22 2023-09-22 山东万牧农业科技有限公司郯城分公司 Image feature-based cultivation feed quality refinement detection method
CN116934740A (en) * 2023-09-11 2023-10-24 深圳市伟利达精密塑胶模具有限公司 Plastic mold surface defect analysis and detection method based on image processing

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Evolutionary refinement approaches for band selection of hyperspectral images with applications to automatic monitoring of animal feed quality;Philip Wilcox et al.;《Intelligent Data Analysis》;20141231;全文 *

Also Published As

Publication number Publication date
CN117808806A (en) 2024-04-02

Similar Documents

Publication Publication Date Title
CN104794491B (en) Based on the fuzzy clustering Surface Defects in Steel Plate detection method presorted
US5612928A (en) Method and apparatus for classifying objects in sonar images
CN115311292A (en) Strip steel surface defect detection method and system based on image processing
CN115018828A (en) Defect detection method for electronic component
CN116758083B (en) Quick detection method for metal wash basin defects based on computer vision
CN113724231B (en) Industrial defect detection method based on semantic segmentation and target detection fusion model
CN106326834B (en) method and device for automatically identifying sex of human body
CN111476804B (en) Efficient carrier roller image segmentation method, device, equipment and storage medium
CN114723705A (en) Cloth flaw detection method based on image processing
CN108898132A (en) A kind of terahertz image dangerous material recognition methods based on Shape context description
CN116912248B (en) Irregular hardware surface defect detection method based on computer vision
CN116152242B (en) Visual detection system of natural leather defect for basketball
CN115049651B (en) Metal plate stamping abnormity detection method
CN116958125B (en) Electronic contest host power supply element defect visual detection method based on image processing
Davies Introduction to texture analysis
CN114897773A (en) Distorted wood detection method and system based on image processing
CN116168025B (en) Oil curtain type fried peanut production system
CN115880501A (en) High-voltage wire infrared image processing method and system based on infrared camera
CN117314901B (en) Scale-adaptive chip detection neural network system
CN107545565B (en) Solar screen plate detection method
CN117808806B (en) Feed production quality refinement detection method based on image feature analysis
CN116152255B (en) Modified plastic production defect judging method
CN115862006B (en) Bran star detection method in flour milling process
CN117274293A (en) Accurate bacterial colony dividing method based on image features
Fei et al. Change detection in remote sensing images of damage areas with complex terrain using texture information and SVM

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant