CN114494099A - Method and device for detecting rice processing precision and storage medium - Google Patents


Info

Publication number
CN114494099A
Authority
CN
China
Prior art keywords
image
rice
remaining
gray value
area
Prior art date
Legal status
Pending
Application number
CN202011147481.0A
Other languages
Chinese (zh)
Inventor
朱邵成
范二荣
Current Assignee
Hefei Meyer Optoelectronic Technology Inc
Original Assignee
Hefei Meyer Optoelectronic Technology Inc
Priority date
Filing date
Publication date
Application filed by Hefei Meyer Optoelectronic Technology Inc filed Critical Hefei Meyer Optoelectronic Technology Inc
Priority to CN202011147481.0A priority Critical patent/CN114494099A/en
Publication of CN114494099A publication Critical patent/CN114494099A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/0002Inspection of images, e.g. flaw detection
    • G06T7/0004Industrial image inspection
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01NINVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/84Systems specially adapted for particular applications
    • G01N21/85Investigating moving fluids or granular solids
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/10Segmentation; Edge detection
    • G06T7/136Segmentation; Edge detection involving thresholding
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/10Segmentation; Edge detection
    • G06T7/187Segmentation; Edge detection involving region growing; involving region merging; involving connected component labelling
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/60Analysis of geometric attributes
    • G06T7/62Analysis of geometric attributes of area, perimeter, diameter or volume
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/90Determination of colour characteristics
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01NINVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/84Systems specially adapted for particular applications
    • G01N21/85Investigating moving fluids or granular solids
    • G01N2021/8592Grain or other flowing solid samples
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10024Color image
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30108Industrial image inspection
    • G06T2207/30128Food products

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Quality & Reliability (AREA)
  • Health & Medical Sciences (AREA)
  • Geometry (AREA)
  • Chemical & Material Sciences (AREA)
  • Analytical Chemistry (AREA)
  • Biochemistry (AREA)
  • General Health & Medical Sciences (AREA)
  • Immunology (AREA)
  • Pathology (AREA)
  • Investigating Materials By The Use Of Optical Means Adapted For Particular Applications (AREA)

Abstract

The invention discloses a method and device for detecting rice processing precision, and a storage medium. The detection method for rice processing precision comprises the following steps: collecting images of rice grains from at least three angles, and recording the images from the at least three angles as a first image; preprocessing the first image to obtain a second image; and calculating the processing precision of the rice grains according to the first image and the second image. The method detects the surface characteristics of rice grains based on the multi-angle images, determines the rice processing precision, has high detection accuracy, can reduce labor cost, and improves the production efficiency of a rice production line.

Description

Method and device for detecting rice processing precision and storage medium
Technical Field
The invention relates to the technical field of image processing, in particular to a method and equipment for detecting rice processing precision and a storage medium.
Background
As living standards improve, consumers place higher demands on the nutritional value and appearance of edible rice. The State Administration for Market Regulation and the Standardization Administration of China have issued a new national rice standard, GB/T 1354. The germ portion of a rice grain reflects its nutritional value and, by extension, the quality of the rice.
In the related art, the degree of remaining skin on rice is usually detected by a staining method, which requires preparing a chemical stain in advance and soaking a sample. However, the skin layer of the sample tends to fall off during soaking, so the calculated degree of remaining skin is inaccurate, and the workload on operators is heavy. Moreover, the staining method stains both the germ and any residual husk on the rice grains, making accurate identification of the germ difficult.
Disclosure of Invention
The present invention is directed to solving, at least to some extent, one of the technical problems in the related art. Therefore, the first objective of the present invention is to provide a method for detecting rice processing accuracy, so as to classify the types of rice grains according to specific rice production processing tasks, thereby determining the processing accuracy of rice.
A second object of the invention is to propose a computer-readable storage medium.
The third purpose of the invention is to provide a detection device for the processing precision of rice.
In order to achieve the above object, an embodiment of the first aspect of the present invention provides a method for detecting the processing precision of rice, the method comprising the following steps: acquiring images of rice grains from at least three angles through at least three image acquisition devices, and recording the images from the at least three angles as a first image; preprocessing the first image to obtain a second image; and calculating the processing precision of the rice grains according to the first image and the second image.
According to the detection method for rice processing precision of the embodiment of the invention, images of rice grains from at least three angles are collected by at least three image acquisition devices and recorded as a first image; the first image is preprocessed to obtain a second image, and the processing precision of the rice grains is calculated according to the first image and the second image. Therefore, the detection accuracy can be improved, the labor cost can be reduced, and the production efficiency of the rice production line is improved.
In addition, the method for detecting the processing accuracy of the rice of the invention can also have the following additional technical characteristics:
according to one embodiment of the invention, the calculating the processing accuracy of the rice grains from the first image and the second image comprises: determining whether the rice grains are embryo-remaining and skin-remaining rice according to the first image and the second image; if the rice grains are embryo-remaining and skin-remaining rice, calculating embryo-remaining parameters and skin-remaining parameters of the rice grains; and calculating the processing precision of the rice grains according to the embryo retention parameters and the skin retention parameters.
According to an embodiment of the invention, the pre-processing the first image comprises: and carrying out binarization processing on the first image to obtain a second image, wherein the gray value of a pixel point in the second image is a first gray value or a second gray value.
According to one embodiment of the invention, the determining whether the rice grains are embryo-remaining and skin-remaining rice according to the first image and the second image comprises: obtaining a blue value B and a red value R of each pixel point in the first image; setting the gray value of the pixel point (i, j) satisfying the formula 100 x (R-B)/R > nRB as the first gray value to obtain a third image, wherein nRB is a first preset threshold; dividing a preset region by taking each pixel point whose gray value in the third image is the first gray value as a center; counting a first number of pixel points with gray values of the first gray value in each preset region, and setting the gray value of the central pixel corresponding to the preset region as the second gray value when the first number is smaller than the first preset number, to obtain a fourth image; acquiring the connected region with the largest area among the connected regions with the gray value of the first gray value in the fourth image; and determining the rice grains as germ-remaining rice according to the connected region with the largest area and the second image.
According to an embodiment of the invention, the determining the rice grains as germ-remaining rice according to the largest area of the connected region and the second image comprises: calculating the compactness of the communication region with the largest area according to the following formula:
fcompactness=nEdgeArea*nEdgeArea/nMaxArea,
wherein, fcompactness is the compactness, nEdgeArea is the outline perimeter of the connected region with the largest area, and nMaxArea is the area of the connected region with the largest area; if the compactness is larger than a second preset threshold value, acquiring a first position of the centroid of the connected region with the largest area, and acquiring a second position of the centroid of the rice grain where the connected region with the largest area is located according to the second image; calculating a distance between the first position and the second position; and if the distance is greater than or equal to a third preset threshold value, determining the rice grains as germ-remaining rice.
According to one embodiment of the invention, the determining whether the rice grains are embryo-remaining and skin-remaining rice according to the first image and the second image comprises: obtaining a blue value B of each pixel point in the first image; calculating the gradient values of the pixel point (i, j) in a plurality of directions around the pixel point (i, j) according to the following formula:
nT1(k)=smooth(i'(k),j'(k))-smooth(i,j),
k is greater than or equal to 4 and less than or equal to K, K and K are integers, K represents the number of directions, the gray value of a pixel point (i, j) in the second image is the first gray value, the blue value in the first image is less than or equal to a fourth preset threshold, nT1(K) is the gradient value of the pixel point (i, j) in the first image in the kth direction, smooth (i, j) is the blue value of the pixel point (i, j) in the first image, smooth (i '(K), j' (K)) is the blue value of the pixel point (i '(K), j' (K)) in the first image, and the distance between the pixel point (i '(K), j' (K)) and the pixel point (i, j) is a preset step length; when the K gradient values are all larger than a fifth preset threshold value, reserving the gray value of the pixel point (i, j) in the second image as the first gray value, otherwise, setting the gray value of the pixel point (i, j) in the second image as the second gray value to obtain a fifth image; and determining whether the rice grains are embryo-remaining and skin-remaining rice according to the fifth image.
According to an embodiment of the invention, the determining from the fifth image whether the rice grains are embryo-remaining and skin-remaining rice comprises: acquiring connected regions in the fifth image whose area is smaller than a sixth preset threshold or larger than a seventh preset threshold, and deleting them to obtain a sixth image, wherein the sixth preset threshold is smaller than the seventh preset threshold; setting, among the pixel points whose gray value in the sixth image is the first gray value, the gray value of the pixel points whose corresponding blue value in the first image is greater than an eighth preset threshold to the second gray value, to obtain a seventh image; counting the diagonal length of the minimum circumscribed rectangle of each connected region with the gray value of the first gray value in the seventh image; and if a diagonal length greater than a ninth preset threshold exists, determining the rice grains as skin-remaining rice.
According to an embodiment of the invention, the embryo-remaining parameter is the embryo-remaining area of the embryo-remaining region of the rice grain, and the skin-remaining parameter is the skin-remaining area of the skin-remaining region of the rice grain, wherein the embryo-remaining area is the area of the largest connected region among the connected regions whose gray value is the first gray value in the fourth image, and the skin-remaining area is the total area of all connected regions whose gray value is the first gray value in the seventh image.
According to an embodiment of the invention, the method for detecting the processing accuracy of the rice further comprises the following steps: calculating the processing accuracy of the rice grains by the following formula:
nRate = (ΣnEmbryoArea + ΣnSkinArea) / ΣnRiceArea × 100%,
wherein nRate is the processing precision, ΣnEmbryoArea is the embryo-remaining area, ΣnSkinArea is the skin-remaining area, and ΣnRiceArea is the rice grain area.
In order to achieve the above object, a second aspect of the present invention provides a computer-readable storage medium, on which a computer program is stored, wherein the computer program, when executed by a processor, implements the above method for detecting processing accuracy of rice.
The computer-readable storage medium of the embodiment of the invention can improve the detection accuracy of the rice processing precision, reduce the labor cost and improve the production efficiency of the rice production line when the computer program stored on the computer-readable storage medium is executed by the processor.
In order to achieve the above object, a third aspect of the present invention provides a device for detecting rice processing accuracy, which comprises a memory, a processor and a computer program stored in the memory, wherein when the computer program is executed by the processor, the method for detecting rice processing accuracy is realized.
According to the detection device for the rice processing precision, disclosed by the embodiment of the invention, the detection accuracy of the rice processing precision can be improved, the labor cost is reduced, and the production efficiency of a rice production line is improved by realizing the detection method for the rice processing precision.
Additional aspects and advantages of the invention will be set forth in part in the description which follows and, in part, will be obvious from the description, or may be learned by practice of the invention.
Drawings
Fig. 1 is a flowchart of a method for detecting processing accuracy of rice according to an embodiment of the present invention;
FIG. 2 is a flow chart of a method for detecting the processing accuracy of rice according to an embodiment of the present invention;
FIG. 3 is a flow chart of a method for detecting the processing accuracy of rice according to another embodiment of the present invention;
fig. 4 is a schematic view of a rice processing accuracy detecting apparatus according to an example of the present invention;
FIG. 5 is a schematic view of a first exemplary rice grain of the present invention;
FIG. 6 is a schematic illustration of a second exemplary rice grain of the present invention;
FIG. 7 is a schematic illustration of a third exemplary rice grain of the present invention;
FIG. 8 is a schematic illustration of a fourth exemplary rice grain of the present invention;
FIG. 9 is a schematic view of a rice processing accuracy test page of one example of the present invention;
fig. 10 is a schematic view of a rice processing accuracy test page according to another example of the present invention.
Detailed Description
Reference will now be made in detail to embodiments of the present invention, examples of which are illustrated in the accompanying drawings, wherein like or similar reference numerals refer to the same or similar elements or elements having the same or similar function throughout. The embodiments described below with reference to the drawings are illustrative and intended to be illustrative of the invention and are not to be construed as limiting the invention.
A method, an apparatus, and a storage medium for detecting rice processing accuracy according to an embodiment of the present invention will be described with reference to fig. 1 to 10.
Fig. 1 is a flowchart of a method for detecting rice processing accuracy according to an embodiment of the present invention.
And S11, acquiring images of the rice grains at least three angles, and recording the images at least three angles as a first image.
Wherein the rice grains are processed rice grains, and the first image may be an RGB image.
Specifically, images of rice grains from at least three angles may be acquired by a plurality of image acquisition devices. As shown in fig. 4, the number of image acquisition devices may be 3, denoted image acquisition devices 1, 2, and 3. The image acquisition devices 1, 2, and 3 simultaneously receive the R, G, and B signals produced by the white-light LED light sources 4, 5, and 6 illuminating the surface of the rice grain. To obtain the above RGB images, the image acquisition devices 1, 2, and 3 simultaneously capture multi-angle images of the rice grain; the images from the multiple angles are then used to identify the rice grain and its germ, and to calculate the degree of remaining germ and the degree of remaining skin.
Specifically, referring to fig. 4, the three image acquisition devices may be located in the same plane at equal included angles (i.e., 120° between adjacent devices) and simultaneously capture the omnidirectional appearance of the same rice grain. The three groups of white-light LED light sources are arranged, at the same included angles, above and below on the same side as the three image acquisition devices, so that the image acquisition devices can simultaneously receive the R, G, and B components of the white light.
And S12, preprocessing the first image to obtain a second image.
As one example, preprocessing the first image may include: and carrying out binarization processing on the first image to obtain a second image, wherein the gray value of a pixel point in the second image is a first gray value or a second gray value. The first gray scale value may be a gray scale value of 255, and the second gray scale value may be a gray scale value of 0.
For example, referring to fig. 5, a first image is shown in (a) of fig. 5, and a second image obtained by the above-described method of preprocessing the first image may be shown in (b) of fig. 5.
It should be noted that, due to the black background, there is a possibility that the rice grain binary image in the second image has an internal void, as shown in fig. 6 (a). Therefore, after the first image is preprocessed to obtain the second image, it is necessary to perform edge detection on the second image, detect the edges of the rice grain binary image, and fill the inside of the rice grain binary image to obtain the image shown in fig. 6 (b).
Optionally, a morphological erosion operation can further be applied to erode the edge of the second image by two pixel rings (two iterations), so that color distortion of edge pixels does not affect the identification.
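As an illustration of this preprocessing step, a minimal sketch using OpenCV and NumPy is given below; the binarization threshold, the 3x3 kernel, and the function name preprocess_first_image are assumptions for illustration, not the patent's reference implementation.

```python
import cv2
import numpy as np

def preprocess_first_image(first_image_bgr, background_thresh=30):
    """Binarize the first image against the black background, fill internal
    voids, and erode the edge by two pixel rings (illustrative parameters)."""
    gray = cv2.cvtColor(first_image_bgr, cv2.COLOR_BGR2GRAY)
    # Pixels brighter than the background become the first gray value (255),
    # all others the second gray value (0).
    _, second_image = cv2.threshold(gray, background_thresh, 255, cv2.THRESH_BINARY)

    # Edge detection / hole filling: find the outer contours of the grain and
    # fill their interior so the binary grain image has no internal voids.
    contours, _ = cv2.findContours(second_image, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    cv2.drawContours(second_image, contours, -1, 255, thickness=cv2.FILLED)

    # Morphological erosion by two iterations (two pixel rings) so that color
    # distortion at the grain edge does not affect later identification.
    kernel = np.ones((3, 3), np.uint8)
    return cv2.erode(second_image, kernel, iterations=2)
```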
And S13, determining whether the rice grains are embryo-remaining and skin-remaining rice according to the first image and the second image.
And S14, if the rice grains are the germ-remaining and skin-remaining rice, calculating germ-remaining parameters and skin-remaining parameters of the rice grains.
The embryo-remaining parameter can be the area of the embryo-remaining region of the rice grain, and the skin-remaining parameter can be the area of the skin-remaining region of the rice grain.
Specifically, if the rice grains are determined to be embryo-remaining and skin-remaining rice, the embryo-remaining parameter, the skin-remaining parameter, and the area of the rice grains are calculated.
And S15, calculating the processing precision of the rice grains according to the germ retention parameters and the skin retention parameters.
Specifically, the processing accuracy of rice grains can be calculated according to the following formula:
nRate = (ΣnEmbryoArea + ΣnSkinArea) / ΣnRiceArea × 100%,
where nRate is the processing precision, ΣnEmbryoArea is the embryo-remaining area, ΣnSkinArea is the skin-remaining area, and ΣnRiceArea is the rice grain area; ΣnRiceArea may be the area of the connected region in the second image (i.e., the connected region of pixel points whose gray value is the first gray value).
Further, rice grains may be classified according to processing precision into well-milled and moderately milled grades, or into super-milled, well-milled, and moderately milled grades. Specifically, a processing-precision interval is set in advance for each grade, and the grade of the rice grains can be determined from the interval in which the calculated processing precision falls.
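A minimal sketch of the precision calculation and grading is given below, assuming the percentage formula above; the grade intervals and the English grade labels are illustrative assumptions, not values given in the patent.

```python
def processing_precision(embryo_areas, skin_areas, rice_areas):
    """nRate = (sum of embryo-remaining areas + sum of skin-remaining areas)
    / (sum of rice-grain areas) * 100, expressed in percent."""
    return 100.0 * (sum(embryo_areas) + sum(skin_areas)) / sum(rice_areas)

def grade(n_rate, well_milled_max=1.0, moderately_milled_max=5.0):
    """Map the precision to a grade; the interval bounds are illustrative."""
    if n_rate <= well_milled_max:
        return "well-milled"
    if n_rate <= moderately_milled_max:
        return "moderately milled"
    return "below standard"

# Example with per-grain areas in pixels:
n_rate = processing_precision([120], [35], [21000])
print(n_rate, grade(n_rate))  # ~0.74 (%), "well-milled"
```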
In some examples, as shown in fig. 2, determining from the first image and the second image whether the rice grains are embryo-remaining and skin-remaining rice may comprise:
and S31, obtaining the blue value B and the red value R of each pixel point in the first image.
And S32, setting the gray value of the pixel point (i, j) satisfying the formula 100 x (R-B)/R > nRB as a first gray value to obtain a third image, wherein nRB is a first preset threshold.
The first preset threshold may be 10 to 35, for example, 13.5.
And S33, performing preset region division by taking each pixel point with the gray value in the third image as the first gray value as the center.
In this example, the preset area may be a frame with M rows and N columns. For example, it may be a 5-row 7-column box.
And S34, counting a first number of pixel points with gray values of the first gray values in each preset area, and setting the gray value of the central pixel corresponding to the preset area as a second gray value when the first number is smaller than the first preset number to obtain a fourth image.
Specifically, when the first number is greater than or equal to the first preset number, the gray value of the center point pixel corresponding to the preset area may be set as the first gray value, otherwise, the gray value of the center point pixel may be set as the second gray value, so as to obtain the fourth image.
The first preset number can be 6-10, for example, 8; the embryo remaining area may be an area of a connected region having a largest area among connected regions having a gray value of the first gray value in the fourth image.
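Steps S31 to S34 might be sketched as follows with NumPy and OpenCV, using the example values above (nRB = 13.5, a 5-row by 7-column window, first preset number 8); the function name and the box-filter window counting are implementation assumptions.

```python
import cv2
import numpy as np

def third_and_fourth_images(first_image_bgr, nRB=13.5,
                            rows=5, cols=7, min_count=8):
    """Third image: pixels with 100*(R-B)/R > nRB get the first gray value (255).
    Fourth image: a pixel is kept only if its rows x cols neighborhood contains
    at least min_count such pixels."""
    b = first_image_bgr[:, :, 0].astype(np.float32)
    r = first_image_bgr[:, :, 2].astype(np.float32)
    ratio = np.zeros_like(r)
    np.divide(100.0 * (r - b), r, out=ratio, where=r > 0)
    third = np.where(ratio > nRB, 255, 0).astype(np.uint8)

    # Count first-gray-value pixels inside the window centered on each pixel.
    counts = cv2.boxFilter((third == 255).astype(np.float32), -1,
                           (cols, rows), normalize=False)
    fourth = np.where((third == 255) & (counts >= min_count), 255, 0)
    return third, fourth.astype(np.uint8)
```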
And S35, acquiring the connected region with the largest area in the connected regions with the gray values as the first gray values in the fourth image.
And S36, determining the rice grains as germ-remaining rice according to the communication area with the largest area and the second image.
Specifically, the connected region with the largest area in the fourth image is obtained, and the pixels of the other connected regions are set as the second gray scale values, so that the compactness of the connected region with the largest area can be calculated according to the following formula:
fcompactness=nEdgeArea*nEdgeArea/nMaxArea
wherein fcompactness is the compactness, nEdgeArea is the contour perimeter of the connected region with the largest area, and nMaxArea is the area of the connected region with the largest area.
Further, if the compactness is larger than a second preset threshold, a first position of the centroid of the connected region with the largest area is acquired, and a second position of the centroid of the rice grain where that connected region is located is acquired according to the second image; the distance between the first position and the second position is then calculated. If the distance is greater than or equal to a third preset threshold, the rice grains are determined to be germ-remaining rice.
The second preset threshold may be 27 to 33, for example, 30. The third preset threshold may be a value between 8 and 12, and may be 10, for example.
In this example, referring to fig. 7, (a) in fig. 7 shows a first image of germ-remaining rice, and (b) in fig. 7 shows the corresponding fourth image.
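The largest-connected-region, compactness, and centroid-distance checks of S35 and S36 could look like the following sketch (OpenCV assumed; the default thresholds follow the example values above):

```python
import cv2
import numpy as np

def is_germ_remaining(fourth_image, second_image,
                      compact_thresh=30.0, dist_thresh=10.0):
    """Return True if the grain is judged to be germ-remaining rice."""
    n, labels, stats, centroids = cv2.connectedComponentsWithStats(fourth_image)
    if n <= 1:                      # label 0 is the background
        return False
    idx = 1 + int(np.argmax(stats[1:, cv2.CC_STAT_AREA]))
    region = (labels == idx).astype(np.uint8) * 255

    # fcompactness = nEdgeArea * nEdgeArea / nMaxArea
    contours, _ = cv2.findContours(region, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_NONE)
    perimeter = cv2.arcLength(contours[0], True)
    compactness = perimeter * perimeter / stats[idx, cv2.CC_STAT_AREA]
    if compactness <= compact_thresh:
        return False

    # Compare the centroid of the largest region with the grain centroid.
    first_pos = centroids[idx]
    m = cv2.moments(second_image, binaryImage=True)
    second_pos = np.array([m["m10"] / m["m00"], m["m01"] / m["m00"]])
    return float(np.linalg.norm(first_pos - second_pos)) >= dist_thresh
```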
In some examples, as shown in fig. 3, determining from the first image and the second image whether the rice grains are embryo-remaining and skin-remaining rice may comprise:
and S41, obtaining the blue value B of each pixel point in the first image.
S42, calculating gradient values of the pixel points (i, j) in a plurality of directions around the pixel points (i, j).
Specifically, the gradient values of the pixel points (i, j) in a plurality of directions around the pixel points (i, j) can be calculated according to the following formula:
nT1(k)=smooth(i'(k),j'(k))-smooth(i,j),
k is greater than or equal to 4 and less than or equal to K, K and K are integers, K represents the number of directions, the gray value of a pixel point (i, j) in the second image is a first gray value, the blue value in the first image is less than or equal to a fourth preset threshold, nT1(K) is the gradient value of the pixel point (i, j) in the first image in the K-th direction, smooth (i, j) is the blue value of the pixel point (i, j) in the first image, smooth (i '(K), j' (K)) is the blue value of the pixel point (i '(K), j' (K)) in the first image, and the distance between the pixel point (i '(K), j' (K)) and the pixel point (i, j) is a preset step length. The fourth preset threshold may be 175 to 185, for example, 180; the preset step length can be between 8 and 12, and can be 10, for example.
In this example, K may be 8, that is, 8 directions in total: for example, up, down, left, and right, plus the four directions midway between adjacent pairs of these, i.e., at 45 degrees to them.
And S43, when the K gradient values are all larger than a fifth preset threshold value, keeping the gray value of the pixel point (i, j) in the second image as a first gray value, otherwise, setting the gray value of the pixel point (i, j) in the second image as a second gray value to obtain a fifth image.
The fifth preset threshold may be 23 to 27, for example, 25.
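A sketch of the directional-gradient test of S41 to S43 is given below, with K = 8 directions and the example step length and thresholds above; blue_channel is assumed to be the (optionally smoothed) blue channel of the first image, and the array shifting and out-of-image handling are implementation assumptions.

```python
import numpy as np

def fifth_image(blue_channel, second_image, step=10,
                blue_max=180, grad_thresh=25, K=8):
    """Keep a candidate pixel (first gray value in the second image, blue value
    <= blue_max) only if nT1(k) = smooth(i'(k), j'(k)) - smooth(i, j) exceeds
    grad_thresh in all K directions at the preset step length."""
    b = blue_channel.astype(np.int32)
    h, w = b.shape
    keep = (second_image == 255) & (b <= blue_max)

    # 8 directions: up, down, left, right and the four diagonals.
    dirs = [(-1, 0), (1, 0), (0, -1), (0, 1), (-1, -1), (-1, 1), (1, -1), (1, 1)]
    for di, dj in dirs[:K]:
        si, sj = di * step, dj * step
        # Neighbour value at offset (si, sj); out-of-image neighbours get a very
        # negative value so the gradient test fails there.
        shifted = np.full_like(b, -1_000_000)
        src = b[max(si, 0):h + min(si, 0), max(sj, 0):w + min(sj, 0)]
        shifted[max(-si, 0):h + min(-si, 0), max(-sj, 0):w + min(-sj, 0)] = src
        keep &= (shifted - b) > grad_thresh
    return np.where(keep, 255, 0).astype(np.uint8)
```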
And S44, determining whether the rice grains are embryo-remaining and skin-remaining rice according to the fifth image.
Specifically, connected regions in the fifth image whose area is smaller than the sixth preset threshold or larger than the seventh preset threshold are obtained and deleted to obtain the sixth image, wherein the sixth preset threshold is smaller than the seventh preset threshold. Then, among the pixel points whose gray value in the sixth image is the first gray value, the gray value of those whose corresponding blue value in the first image is greater than the eighth preset threshold is set to the second gray value, to obtain a seventh image.
Specifically, falsely identified regions can be removed by setting to the second gray value the pixel points whose corresponding blue value in the first image is greater than the eighth preset threshold.
Further, the diagonal length of the minimum circumscribed rectangle of each connected region with the gray value of the first gray value in the seventh image is counted. If any diagonal length is greater than the ninth preset threshold, the rice grain is determined to be skin-remaining rice.
The sixth preset threshold may be 75 to 85, for example, 80. The seventh preset threshold may be 500 to 520, for example, 510. The eighth preset threshold may be between 80 and 90, for example, 85. The ninth preset threshold may be 27 to 33, for example, 30. The skin-remaining area may be the area of the connected regions with the gray value of the first gray value in the seventh image (i.e., the connected regions with a gray value of 255); of course, the skin-remaining area may also be the area of all connected regions in the seventh image whose diagonal length is greater than the ninth preset threshold.
In this example, referring to fig. 8, (a) in fig. 8 shows a first image of skin-remaining rice, and (b) in fig. 8 shows the corresponding seventh image.
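The fifth-image post-processing and the diagonal test might be sketched as follows; the area bounds (80, 510), the blue-value threshold (85), and the diagonal threshold (30) follow the example values above, regions outside the area bounds are deleted, and which named threshold governs which test is an interpretation of the text.

```python
import cv2
import numpy as np

def is_skin_remaining(fifth_image, blue_channel,
                      area_min=80, area_max=510, blue_max=85, diag_thresh=30):
    """Sixth image: drop connected regions smaller than area_min or larger than
    area_max. Seventh image: additionally clear pixels whose blue value exceeds
    blue_max. The grain is skin-remaining if any remaining region has a minimum
    circumscribed rectangle whose diagonal exceeds diag_thresh."""
    n, labels, stats, _ = cv2.connectedComponentsWithStats(fifth_image)
    sixth = np.zeros_like(fifth_image)
    for i in range(1, n):
        if area_min <= stats[i, cv2.CC_STAT_AREA] <= area_max:
            sixth[labels == i] = 255

    seventh = sixth.copy()
    seventh[blue_channel > blue_max] = 0        # remove falsely identified pixels

    n, labels, _, _ = cv2.connectedComponentsWithStats(seventh)
    for i in range(1, n):
        pts = np.column_stack(np.nonzero(labels == i))[:, ::-1].astype(np.float32)
        _, (rect_w, rect_h), _ = cv2.minAreaRect(pts)
        if float(np.hypot(rect_w, rect_h)) > diag_thresh:
            return True
    return False
```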
It should be noted that, because the methods for judging germ-remaining rice and skin-remaining rice are different, the two judging processes can be performed simultaneously; alternatively, the germ-remaining judgment can be performed first and then the skin-remaining judgment, or the skin-remaining judgment first and then the germ-remaining judgment. The embryo-remaining parameter and the skin-remaining parameter are calculated in the corresponding judging processes, and the area of the rice grains is calculated directly from the second image.
In summary, the method for detecting the rice processing precision of the embodiment of the invention detects the surface characteristics of the rice grains based on the multi-angle images of the rice grains, determines the rice processing precision, has high detection accuracy, does not need to dye the rice grains, can reduce the labor cost, and improves the production efficiency of the rice production line.
Based on the above identification, the following data are obtained as shown in table 1:
TABLE 1
Rice sample            Total number    Number identified    Recognition rate
Germ-remaining rice    531             520                  97.9%
Thus, the invention achieves a relatively good identification result: the identification rate for germ-remaining rice reaches 97.9%.
For testing, a well-milled rice sample with a skin-retention degree of 0.7% and a moderately milled rice sample with a skin-retention degree of 3.2% under the 2020 national standard were used. Both samples were officially supplied, specifically by a grain quality inspection center; they are standard rice samples manufactured by Yuehui Grain and Oil Co., Ltd. in Suzhou and are suitable for calibrating rice processing precision, grade identification, and quality control in rice processing production. The skin-retention degree is the processing precision in the present invention. The test results are shown in Table 2 below:
TABLE 2
Rice sample          National-standard skin-retention degree (%)    Measured skin-retention degree (%)
Well-milled          0.7                                            0.73
Moderately milled    3.2                                            3.26
The test interfaces generated during these tests are shown in fig. 9 and fig. 10. Referring to fig. 9, the officially supplied well-milled sample is measured with a skin-retention degree of 0.73% and judged to be well-milled; referring to fig. 10, the officially supplied moderately milled sample is measured with a skin-retention degree of 3.26% and judged to be moderately milled. The processing precision is therefore calculated with high accuracy.
Further, the present invention proposes a computer-readable storage medium.
In the embodiment of the present invention, a computer program is stored on a computer-readable storage medium, and when the computer program is executed by a processor, the method for detecting the processing accuracy of rice is realized.
The computer-readable storage medium of the embodiment of the invention can improve the detection accuracy of the rice processing precision, reduce the labor cost and improve the production efficiency of the rice production line when the computer program stored on the computer-readable storage medium is executed by the processor.
Further, the invention provides a detection device for rice processing precision.
In an embodiment of the present invention, the apparatus for detecting the processing accuracy of rice includes a memory, a processor, and a computer program stored in the memory, and when the computer program is executed by the processor, the method for detecting the processing accuracy of rice is implemented.
According to the detection device for the rice processing precision, disclosed by the embodiment of the invention, the detection accuracy of the rice processing precision can be improved, the labor cost can be reduced, and the production efficiency of a rice production line can be improved by realizing the detection method for the rice processing precision.
It should be noted that the logic and/or steps represented in the flowcharts or otherwise described herein, such as an ordered listing of executable instructions that can be considered to implement logical functions, can be embodied in any computer-readable medium for use by or in connection with an instruction execution system, apparatus, or device, such as a computer-based system, processor-containing system, or other system that can fetch the instructions from the instruction execution system, apparatus, or device and execute the instructions. For the purposes of this description, a "computer-readable medium" can be any means that can contain, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device. More specific examples (a non-exhaustive list) of the computer-readable medium would include the following: an electrical connection (electronic device) having one or more wires, a portable computer diskette (magnetic device), a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber device, and a portable compact disc read-only memory (CDROM). Additionally, the computer-readable medium could even be paper or another suitable medium upon which the program is printed, as the program can be electronically captured, via for instance optical scanning of the paper or other medium, then compiled, interpreted or otherwise processed in a suitable manner if necessary, and then stored in a computer memory.
It should be understood that portions of the present invention may be implemented in hardware, software, firmware, or a combination thereof. In the above embodiments, the various steps or methods may be implemented in software or firmware stored in memory and executed by a suitable instruction execution system. For example, if implemented in hardware, as in another embodiment, any one or combination of the following techniques, which are known in the art, may be used: a discrete logic circuit having a logic gate circuit for implementing a logic function on a data signal, an application specific integrated circuit having an appropriate combinational logic gate circuit, a Programmable Gate Array (PGA), a Field Programmable Gate Array (FPGA), or the like.
In the description herein, references to the description of the term "one embodiment," "some embodiments," "an example," "a specific example," or "some examples," etc., mean that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the invention. In this specification, the schematic representations of the terms used above do not necessarily refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples.
In the description of the present invention, it is to be understood that the terms "central," "longitudinal," "lateral," "length," "width," "thickness," "upper," "lower," "front," "rear," "left," "right," "vertical," "horizontal," "top," "bottom," "inner," "outer," "clockwise," "counterclockwise," "axial," "radial," "circumferential," and the like are used in the orientations and positional relationships indicated in the drawings for convenience in describing the invention and to simplify the description, and are not intended to indicate or imply that the referenced devices or elements must have a particular orientation, be constructed and operated in a particular orientation, and are therefore not to be considered limiting of the invention.
Furthermore, the terms "first" and "second" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defined as "first" or "second" may explicitly or implicitly include at least one such feature. In the description of the present invention, "a plurality" means at least two, e.g., two, three, etc., unless specifically limited otherwise.
In the present invention, unless otherwise expressly stated or limited, the terms "mounted," "connected," "secured," and the like are to be construed broadly and can, for example, be fixedly connected, detachably connected, or integrally formed; can be mechanically or electrically connected; they may be directly connected or indirectly connected through intervening media, or they may be connected internally or in any other suitable relationship, unless expressly stated otherwise. The specific meanings of the above terms in the present invention can be understood by those skilled in the art according to specific situations.
In the present invention, unless otherwise expressly stated or limited, the first feature "on" or "under" the second feature may be directly contacting the first and second features or indirectly contacting the first and second features through an intermediate. Also, a first feature "on," "over," and "above" a second feature may be directly or diagonally above the second feature, or may simply indicate that the first feature is at a higher level than the second feature. A first feature being "under," "below," and "beneath" a second feature may be directly under or obliquely under the first feature, or may simply mean that the first feature is at a lesser elevation than the second feature.
Although embodiments of the present invention have been shown and described above, it is understood that the above embodiments are exemplary and should not be construed as limiting the present invention, and that variations, modifications, substitutions and alterations can be made to the above embodiments by those of ordinary skill in the art within the scope of the present invention.

Claims (10)

1. A detection method for rice processing precision is characterized by comprising the following steps:
collecting images of rice grains from at least three angles, and recording the images from the at least three angles as a first image;
preprocessing the first image to obtain a second image;
determining whether the rice grains are embryo-remaining and skin-remaining rice according to the first image and the second image;
if the rice grains are embryo-remaining and skin-remaining rice, calculating embryo-remaining parameters and skin-remaining parameters of the rice grains;
and calculating the processing precision of the rice grains according to the embryo retention parameters and the skin retention parameters.
2. The method for detecting rice processing accuracy according to claim 1, wherein the first image is an RGB image, and the preprocessing the first image includes:
and carrying out binarization processing on the first image to obtain a second image, wherein the gray value of a pixel point in the second image is a first gray value or a second gray value.
3. The method for detecting rice processing accuracy according to claim 2, wherein the determining whether the rice grains are embryo-remaining and husk-remaining rice based on the first image and the second image comprises:
obtaining a blue value B and a red value R of each pixel point in the first image;
setting the gray value of the pixel point (i, j) satisfying the formula 100 x (R-B)/R > nRB as the first gray value to obtain a third image, wherein nRB is a first preset threshold;
dividing a preset region by taking each pixel point with the gray value in the third image as the first gray value as a center;
counting a first number of pixel points with gray values of the first gray value in each preset area, and setting the gray value of the central pixel corresponding to the preset area as a second gray value when the first number is smaller than the first preset number to obtain a fourth image;
acquiring a connected region with the largest area in the connected region with the gray value as the first gray value in the fourth image;
and determining the rice grains as germ-remaining rice according to the communication area with the largest area and the second image.
4. The method for detecting rice processing accuracy according to claim 3, wherein the step of determining that the rice grains are embryo-left rice based on the connected region having the largest area and the second image comprises:
calculating the compactness of the communication region with the largest area according to the following formula:
fcompactness=nEdgeArea*nEdgeArea/nMaxArea,
wherein, fcompactness is the compactness, nEdgeArea is the outline perimeter of the communication area with the largest area, and nMaxArea is the area of the communication area with the largest area;
if the compactness is larger than a second preset threshold value, acquiring a first position of the centroid of the communicated region with the largest area, and acquiring a second position of the centroid of the rice grain where the communicated region with the largest area is located according to the second image;
calculating a distance between the first location and the second location;
and if the distance is greater than or equal to a third preset threshold value, determining the rice grains as germ-remaining rice.
5. The method for detecting rice processing precision according to claim 3 or 4, wherein the determining whether the rice grains are embryo-remaining and skin-remaining rice according to the first image and the second image comprises:
obtaining a blue value B of each pixel point in the first image;
calculating the gradient values of the pixel points (i, j) in a plurality of directions around the pixel points (i, j) according to the following formula:
nT1(k)=smooth(i'(k),j'(k))-smooth(i,j),
k is greater than or equal to 4 and less than or equal to K, K and K are integers, K represents the number of directions, the gray value of a pixel point (i, j) in the second image is the first gray value, the blue value in the first image is less than or equal to a fourth preset threshold, nT1(K) is the gradient value of the pixel point (i, j) in the first image in the kth direction, smooth (i, j) is the blue value of the pixel point (i, j) in the first image, smooth (i '(K), j' (K)) is the blue value of the pixel point (i '(K), j' (K)) in the first image, and the distance between the pixel point (i '(K), j' (K)) and the pixel point (i, j) is a preset step length;
when the K gradient values are all larger than a fifth preset threshold value, keeping the gray value of the pixel point (i, j) in the second image as the first gray value, otherwise, setting the gray value of the pixel point (i, j) in the second image as the second gray value to obtain a fifth image;
and determining whether the rice grains are embryo-remaining and skin-remaining rice according to the fifth image.
6. The method for detecting rice processing accuracy according to claim 5, wherein the determining whether the rice grains are embryo-remaining and skin-remaining rice according to the fifth image comprises:
acquiring connected regions in the fifth image whose area is smaller than a sixth preset threshold or larger than a seventh preset threshold, and deleting the connected regions to obtain a sixth image, wherein the sixth preset threshold is smaller than the seventh preset threshold;
setting, among the pixel points whose gray value in the sixth image is the first gray value, the gray value of the pixel points whose corresponding blue value in the first image is greater than an eighth preset threshold to the second gray value, to obtain a seventh image;
counting the length of the diagonal line of the minimum circumscribed rectangle of all the connected regions with the gray values of the first gray value in the seventh image;
and if a diagonal length greater than the ninth preset threshold exists, determining the rice grains as skin-remaining rice.
7. The method for detecting rice processing accuracy according to claim 6, wherein the embryo remaining parameter is an embryo remaining area of the rice embryo remaining region, and the husk remaining parameter is a husk remaining area of the rice husk remaining region, wherein the embryo remaining area is an area of a connected region having a largest area among connected regions having a gray value of a first gray value in the fourth image, and the husk remaining area is an area of all connected regions having a gray value of the first gray value in the seventh image.
8. The method for detecting rice processing accuracy according to claim 7, wherein the rice processing accuracy is calculated by the following formula:
nRate = (ΣnEmbryoArea + ΣnSkinArea) / ΣnRiceArea × 100%,
wherein nRate is the processing precision, ΣnEmbryoArea is the embryo-remaining area, ΣnSkinArea is the skin-remaining area, and ΣnRiceArea is the rice grain area.
9. A computer-readable storage medium on which a computer program is stored, wherein the computer program, when executed by a processor, implements the method for detecting rice processing accuracy according to any one of claims 1 to 8.
10. A rice processing accuracy detecting apparatus comprising a memory, a processor, and a computer program stored on the memory, wherein the computer program, when executed by the processor, implements the rice processing accuracy detecting method according to any one of claims 1 to 8.
CN202011147481.0A 2020-10-23 2020-10-23 Method and device for detecting rice processing precision and storage medium Pending CN114494099A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011147481.0A CN114494099A (en) 2020-10-23 2020-10-23 Method and device for detecting rice processing precision and storage medium


Publications (1)

Publication Number Publication Date
CN114494099A true CN114494099A (en) 2022-05-13

Family

ID=81471210

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011147481.0A Pending CN114494099A (en) 2020-10-23 2020-10-23 Method and device for detecting rice processing precision and storage medium

Country Status (1)

Country Link
CN (1) CN114494099A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116754548A (en) * 2022-07-12 2023-09-15 黑龙江省农业科学院食品加工研究所 Determination method for peel retention degree of processed rice
CN116754548B (en) * 2022-07-12 2024-05-24 黑龙江省农业科学院食品加工研究所 Determination method for peel retention degree of processed rice


Similar Documents

Publication Publication Date Title
CN114723701B (en) Gear defect detection method and system based on computer vision
CN109447945B (en) Quick counting method for basic wheat seedlings based on machine vision and graphic processing
CN110120042B (en) Crop image pest and disease damage area extraction method based on SLIC super-pixel and automatic threshold segmentation
CN107909138A (en) A kind of class rounded grain thing method of counting based on Android platform
CN109472261B (en) Computer vision-based automatic monitoring method for grain storage quantity change of granary
CN115115612B (en) Surface defect detection method and system for mechanical parts
CN106067177B (en) HDR scene detection method and device
CN110210448B (en) Intelligent face skin aging degree identification and evaluation method
CN111753577A (en) Apple identification and positioning method in automatic picking robot
CN111259925B (en) K-means clustering and width mutation algorithm-based field wheat spike counting method
CN116721391B (en) Method for detecting separation effect of raw oil based on computer vision
CN110648330B (en) Defect detection method for camera glass
CN115205319B (en) Seed feature extraction and classification method used in seed selection process
CN114549441A (en) Sucker defect detection method based on image processing
Cheng et al. Superpixel classification based optic cup segmentation
CN115797473B (en) Concrete forming evaluation method for civil engineering
CN114820625A (en) Automobile top block defect detection method
CN107067430A (en) A kind of wheatland crop row detection method of distinguished point based cluster
CN107239761B (en) Fruit tree branch pulling effect evaluation method based on skeleton angular point detection
CN114494099A (en) Method and device for detecting rice processing precision and storage medium
CN114088624B (en) Equipment for detecting surface regularity of grain particles
CN114550167B (en) Artificial intelligence based pear quality classification method and device
Cheng et al. Superpixel classification based optic disc segmentation
CN114486877B (en) Rice quality detection method, rice quality detection device and storage medium
CN114088714A (en) Method for detecting surface regularity of grain particles

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination