CN116503404B - Plastic toy quality detection method and device, electronic equipment and storage medium - Google Patents

Plastic toy quality detection method and device, electronic equipment and storage medium

Info

Publication number
CN116503404B
Authority
CN
China
Prior art keywords
window
pixel
sub
characteristic
image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202310762311.0A
Other languages
Chinese (zh)
Other versions
CN116503404A (en)
Inventor
王雷
王新文
陈家旺
王翠丽
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Liangshan County Innovative Crafts Co ltd
Original Assignee
Liangshan County Innovative Crafts Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Liangshan County Innovative Crafts Co ltd filed Critical Liangshan County Innovative Crafts Co ltd
Priority to CN202310762311.0A priority Critical patent/CN116503404B/en
Publication of CN116503404A publication Critical patent/CN116503404A/en
Application granted granted Critical
Publication of CN116503404B publication Critical patent/CN116503404B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/0002Inspection of images, e.g. flaw detection
    • G06T7/0004Industrial image inspection
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00Image enhancement or restoration
    • G06T5/20Image enhancement or restoration by the use of local operators
    • G06T5/30Erosion or dilatation, e.g. thinning
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00Image enhancement or restoration
    • G06T5/50Image enhancement or restoration by the use of more than one image, e.g. averaging, subtraction
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/10Segmentation; Edge detection
    • G06T7/136Segmentation; Edge detection involving thresholding
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/90Determination of colour characteristics
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10024Color image
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30108Industrial image inspection

Abstract

The application relates to the technical field of image processing, and provides a plastic toy quality detection method and device, an electronic device and a storage medium. The method comprises the following steps: acquiring a gray image and a color space image of the plastic toy; constructing a pixel feature descriptor, and determining a pixel feature value of each pixel point in the image to be detected based on the pixel feature descriptor; determining a sub-window corresponding to each pixel point in the image to be detected, calculating the total feature contribution rate of each sub-window based on the pixel feature values of the pixel points in the sub-window, and determining an optimal window based on the total feature contribution rate, where the number of pixel points in each sub-window is the same; and dividing the gray image based on the optimal window and detecting the quality of the plastic toy based on the divided gray image. The application removes the influence of light and shadow and obtains features that represent the pixel points more accurately, and performing image segmentation with the optimal window improves both segmentation accuracy and quality detection accuracy.

Description

Plastic toy quality detection method and device, electronic equipment and storage medium
Technical Field
The application relates to the field of image processing, in particular to a plastic toy quality detection method, a device, electronic equipment and a storage medium.
Background
Plastic toys are a common type of toy; they are inexpensive and easy to manufacture, and are favored by major toy manufacturers.
Quality detection of plastic toys is generally performed with non-destructive, computer-vision-based methods. Traditional computer vision detection algorithms are simple to implement and computationally efficient, but they require many manually set parameters, have difficulty accurately extracting complex target features, and perform poorly in complex scenes and in scenarios with many target classes. Plastic toys are small and relatively complex in structure, and are easily affected by light and shadow during image acquisition, so threshold segmentation of plastic toy images is often inaccurate. A local binarization algorithm can mitigate this to some extent, but it is sensitive to the choice of the local window size: a wrongly chosen local window degrades the segmentation result. The related art therefore needs improvement.
Disclosure of Invention
The application provides a plastic toy quality detection method, a plastic toy quality detection device, electronic equipment and a storage medium.
In a first aspect, the present application provides a method for detecting the quality of a plastic toy, comprising: obtaining an image to be detected of the plastic toy, wherein the image to be detected comprises a gray level image and a color space image; constructing a pixel characteristic descriptor based on the gray level image and the color space image, and determining a pixel characteristic value of each pixel point in the image to be detected based on the pixel characteristic descriptor, wherein the pixel characteristic value represents the color characteristic of each pixel point; determining a sub-window corresponding to each pixel point in the image to be detected, calculating the total characteristic contribution rate of the sub-window based on the pixel characteristic values of the pixel points in the sub-window, and determining an optimal window based on the total characteristic contribution rate; the number of pixel points in each sub-window is the same; dividing the gray level image based on the optimal window, and detecting the quality of the plastic toy based on the divided gray level image.
In an alternative embodiment, constructing a pixel feature descriptor based on the gray image and the color space image includes: constructing a feature descriptor based on the gray value of each pixel point in the gray image and the hue, saturation and brightness attributes of each pixel point in the color space image. The feature descriptor is T(x,y) = [H(x,y), S(x,y), V(x,y), A(x,y)], where T(x,y) denotes the pixel feature descriptor of pixel point (x,y), H(x,y) denotes the hue, S(x,y) denotes the saturation, V(x,y) denotes the brightness attribute, and A(x,y) denotes the gray value of pixel point (x,y). Determining the pixel feature value of each pixel point in the image to be detected based on the pixel feature descriptor includes: determining the pixel feature value of each pixel point in the image to be detected based on the weights of the hue, saturation, brightness attribute and gray value in the pixel feature descriptor and the values of the hue, saturation, brightness attribute and gray value in the pixel feature descriptor.
In an alternative embodiment, the pixel feature value of each pixel point in the image to be detected is determined using the following formula (1):

Z(x,y) = Σ_{i=1}^{m} w_i · t_i    (1);

where Z(x,y) denotes the pixel feature value of pixel point (x,y); m denotes the vector length of the pixel feature descriptor; w_i denotes the weight of the i-th element in the pixel feature descriptor — when i is 1, w_1 is the weight of the hue H channel component value of pixel point (x,y); when i is 2, w_2 is the weight of the saturation S channel component value; when i is 3, w_3 is the weight of the brightness attribute V channel component value; when i is 4, w_4 is the weight of the gray value; and t_i denotes the value of the i-th element in the pixel feature descriptor, taken from pixel point (x,y) in the color space image for i ≤ 3 and from pixel point (x,y) in the gray image for i = 4.
In an alternative embodiment, calculating the total feature contribution rate of the sub-window based on the pixel feature values of the pixel points in the sub-window includes: calculating a sub-window characteristic value of the sub-window corresponding to each pixel point based on the difference of the pixel characteristic values of each pixel point in the sub-window; calculating a window feature difference based on the sub-window feature value; calculating the sub-window characteristic contribution rate of the sub-window corresponding to each pixel point based on the sub-window characteristic value and the window characteristic difference; and calculating the total characteristic contribution rate based on the sub-window characteristic contribution rate of each sub-window.
In an alternative embodiment, the sub-window feature value of the sub-window corresponding to each pixel point is calculated using the following formula (2):

G(x,y) = (1/k²) · Σ_{(a,b)∈Ω(x,y)} |Z(x,y) − Z(a,b)|    (2);

where G(x,y) denotes the sub-window feature value of the sub-window in which pixel point (x,y) is located, k denotes the side length of that sub-window, Ω(x,y) denotes the set of pixel points in the sub-window, Z(x,y) denotes the pixel feature value of pixel point (x,y), and Z(a,b) denotes the pixel feature value of pixel point (a,b) in the sub-window;

the window feature difference is calculated using the following formula (3):

C = Σ_{(x,y)∈W} G(x,y)    (3);

where C denotes the window feature difference, W denotes the set of pixel points in the window, and G(x,y) denotes the sub-window feature value of the sub-window corresponding to pixel point (x,y);

the sub-window feature contribution rate of each sub-window is calculated using the following formula (4):

g(x,y) = G(x,y) / C    (4);

where g(x,y) denotes the sub-window feature contribution rate of the sub-window corresponding to pixel point (x,y);

the total feature contribution rate is calculated using the following formula (5):

B_Q = Σ_{p=1}^{Q} g_p    (5);

where Q denotes the Q-th sub-window after the sub-window feature contribution rates are sorted from large to small, B_Q denotes the sum of the sub-window feature contribution rates of the 1st to Q-th sub-windows after sorting from large to small, i.e. the total feature contribution rate, and g_p denotes the sub-window feature contribution rate of the p-th sub-window after sorting from large to small.
In an alternative embodiment, determining an optimal window based on the total feature contribution rate includes: determining pixel difference concentration in a window with a preset size based on the total characteristic contribution rate; determining the optimal window based on the pixel difference concentration;
Dividing the gray level image based on the optimal window and detecting the quality of the plastic toy based on the divided gray level image includes: processing the gray level image by using a local binarization algorithm to obtain a local binary image; splicing the local binary images to obtain a detection image of the plastic toy; comparing the detection image with a reference image to obtain a similarity; if the similarity is greater than or equal to a threshold, the plastic toy has no quality problem, and if the similarity is smaller than the threshold, the plastic toy has a quality problem.
In an alternative embodiment, determining the pixel difference concentration in a window of a preset size based on the total feature contribution rate includes:

calculating the pixel difference concentration using the following formula (6):

J_n = Norm(C_n) / u_n    (6);

where n denotes the window side length as it changes from n1 to n2, J_n denotes the pixel difference concentration within the window when the window size is n×n, Norm(C_n) denotes the normalization result of the window feature difference C_n within the window of size n×n, and u_n denotes the number of pixel feature values in the sub-windows when the total feature contribution rate of the sub-windows within the window reaches a threshold.
In a second aspect, the present application provides a plastic toy quality inspection device comprising: the image acquisition module is used for acquiring an image to be detected of the plastic toy, wherein the image to be detected comprises a gray level image and a color space image; the characteristic determining module is used for constructing a pixel characteristic descriptor based on the gray level image and the color space image, determining a pixel characteristic value of each pixel point in the image to be detected based on the pixel characteristic descriptor, and representing the color characteristic of each pixel point; the window determining module is used for determining a sub-window corresponding to each pixel point in the image to be detected, calculating the total characteristic contribution rate of the sub-window based on the pixel characteristic values of the pixel points in the sub-window, and determining an optimal window based on the total characteristic contribution rate; the number of pixel points in each sub-window is the same; and the quality detection module is used for dividing the gray level image based on the optimal window and detecting the quality of the plastic toy based on the divided gray level image.
In a third aspect, the present application provides an electronic device, including a processor and a memory coupled to each other, where the memory is configured to store program instructions for implementing the method of any one of the above-mentioned aspects; the processor is configured to execute the program instructions stored in the memory.
In a fourth aspect, the present application provides a storage medium storing a program file executable to implement the method of any one of the above.
The beneficial effects of the application are as follows: compared with the prior art, the plastic toy quality detection method provided by the application comprises the following steps: obtaining an image to be detected of the plastic toy, wherein the image to be detected comprises a gray level image and a color space image; constructing a pixel characteristic descriptor based on the gray level image and the color space image, and determining a pixel characteristic value of each pixel point in the image to be detected based on the pixel characteristic descriptor, wherein the pixel characteristic value represents the color characteristic of each pixel point; determining a sub-window corresponding to each pixel point in the image to be detected, calculating the total characteristic contribution rate of the sub-window based on the pixel characteristic values of the pixel points in the sub-window, and determining an optimal window based on the total characteristic contribution rate; the number of pixel points in each sub-window is the same; dividing the gray level image based on the optimal window, and detecting the quality of the plastic toy based on the divided gray level image. The application constructs the characteristics of the same pixel point in the gray image and the color space image into the pixel characteristic descriptors, calculates the pixel characteristic values, removes the influence of light and shadow, and represents the characteristics of the pixel point more accurately. In addition, image segmentation is performed through the optimal window, so that segmentation accuracy is improved, and quality detection accuracy is improved.
Drawings
FIG. 1 is a flow chart of a first embodiment of a plastic toy quality inspection method of the present application;
FIG. 2 is a schematic diagram of determining a child window;
FIG. 3 is a schematic view of a plastic toy quality inspection device according to an embodiment of the present application;
FIG. 4 is a schematic diagram of an embodiment of an electronic device of the present application;
fig. 5 is a schematic structural view of a storage medium of the present application.
Detailed Description
The following description of the embodiments of the present application will be made clearly and fully with reference to the accompanying drawings, in which it is evident that the embodiments described are only some, but not all embodiments of the application. All other embodiments, which can be made by those skilled in the art based on the embodiments of the application without making any inventive effort, are intended to be within the scope of the application.
The present application will be described in detail with reference to the accompanying drawings and examples.
Referring to fig. 1, a flow chart of an embodiment of a plastic toy quality detection method according to the present application specifically includes:
step S11: and obtaining an image to be detected of the plastic toy, wherein the image to be detected comprises a gray level image and a color space image.
Specifically, an original image of a plastic toy, such as a gear building block, is captured during production by a CCD (charge-coupled device) camera. The captured original image is an RGB image, and various external disturbances degrade the image quality and affect the analysis of results, so the original RGB image is denoised. Common denoising techniques include bilateral filtering, median filtering and Gaussian filtering; bilateral filtering is used here because it preserves more detail, yielding a denoised RGB image. The denoised image is then converted into a gray image and an HSV color space image, recorded as the image to be detected F; that is, the image to be detected F comprises the gray image and the HSV color space image.
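As an illustration of this preprocessing step, the following Python/OpenCV sketch applies bilateral filtering and then converts the denoised result into the gray image A and the HSV color space image B. The filter parameters (d, sigmaColor, sigmaSpace) and the use of OpenCV are assumptions made for illustration; the patent only states that bilateral filtering is used.

```python
import cv2

def preprocess(path):
    """Denoise the captured RGB image and build the gray image A and HSV image B."""
    bgr = cv2.imread(path)  # OpenCV loads the image in BGR channel order
    if bgr is None:
        raise FileNotFoundError(path)
    # Bilateral filtering smooths noise while preserving edges; parameter values are illustrative.
    denoised = cv2.bilateralFilter(bgr, d=9, sigmaColor=75, sigmaSpace=75)
    gray = cv2.cvtColor(denoised, cv2.COLOR_BGR2GRAY)  # gray image A
    hsv = cv2.cvtColor(denoised, cv2.COLOR_BGR2HSV)    # HSV color space image B
    return gray, hsv
```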
Step S12: and constructing a pixel characteristic descriptor based on the gray level image and the color space image, and determining a pixel characteristic value of each pixel point in the image to be detected based on the pixel characteristic descriptor, wherein the pixel characteristic value represents the color characteristic of each pixel point.
In step S12, a pixel feature descriptor is first constructed based on the gray image and the color space image, which specifically includes: constructing a feature descriptor based on the gray value of each pixel point in the gray image and the hue, saturation and brightness attributes of each pixel point in the color space image. The feature descriptor is T(x,y) = [H(x,y), S(x,y), V(x,y), A(x,y)], where T(x,y) denotes the pixel feature descriptor of pixel point (x,y), H(x,y) denotes the hue, S(x,y) denotes the saturation, V(x,y) denotes the brightness attribute, and A(x,y) denotes the gray value of pixel point (x,y).
In one embodiment, the pixel points in the gray image A correspond one-to-one with the pixel points in the color space image B. A pixel point in the gray image A is denoted (x,y), i.e. the pixel point in the x-th row and y-th column of the gray image A, and its gray value is denoted A(x,y), with value range [0, 255]; the window centered on this pixel point is denoted by its size n×n. A pixel point in the color space image B is likewise denoted (x,y), i.e. the pixel point in the x-th row and y-th column of the HSV image B, and the window centered on it is also of size n×n. The color vector formed by the hue H, saturation S and brightness V of the pixel point is denoted [H(x,y), S(x,y), V(x,y)], where the hue H, saturation S and brightness V each take values within the value range of the HSV color space. A pixel point (x,y) in the image to be detected F therefore carries the gray value attribute from the gray image A and the hue H, saturation S and brightness V attributes from the color space image B, so the pixel feature descriptor of pixel point (x,y) can be constructed as T(x,y) = [H(x,y), S(x,y), V(x,y), A(x,y)].
Further, after the pixel feature descriptors are determined, the pixel feature value of each pixel point in the image to be detected is further determined based on the pixel feature descriptors. The method specifically comprises the following steps: and determining the pixel characteristic value of each pixel point in the image to be detected based on the weights of the hue, the saturation, the brightness attribute and the gray value in the pixel characteristic descriptor and the values of the hue, the saturation, the brightness attribute and the gray value in the pixel characteristic descriptor.
In a specific embodiment, the pixel feature value of each pixel point in the image to be detected is determined using the following formula (1):

Z(x,y) = Σ_{i=1}^{m} w_i · t_i    (1);

where Z(x,y) denotes the pixel feature value of pixel point (x,y); m denotes the vector length of the pixel feature descriptor and takes the empirical value 4 in the application; w_i denotes the weight of the i-th element in the pixel feature descriptor — when i is 1, w_1 is the weight of the hue H channel component value of pixel point (x,y); when i is 2, w_2 is the weight of the saturation S channel component value; when i is 3, w_3 is the weight of the brightness attribute V channel component value; when i is 4, w_4 is the weight of the gray value; and t_i denotes the value of the i-th element in the pixel feature descriptor, taken from pixel point (x,y) in the color space image for i ≤ 3 and from pixel point (x,y) in the gray image for i = 4. To reduce the influence of illumination on color, the weight of the V channel component value is set to 0, i.e. w_3 is 0, and the remaining channel component values and the gray value take empirical weights: w_1 is 0.33, w_2 is 0.41, and w_4 is 0.26.
Specifically, in an RGB image the R, G and B channels are correlated, so when the image is affected by illumination all three channel values change to some extent. This leaves the gray image A with unbalanced color that differs noticeably from the true color, and performing threshold segmentation on it directly may produce an erroneous segmentation result. In the color space image B, by contrast, the brightness, hue and vividness of colors can be represented intuitively, and the H, S and V channels reflect different, relatively independent color information. The application therefore combines the gray value attribute of the plastic toy image in the gray image A with the hue, saturation and brightness attributes in the HSV image B to calculate the pixel feature value, which reduces the influence of illumination on color and better represents the actual color of each pixel point in the plastic toy image.
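A minimal sketch of computing the pixel feature value map Z under this weighting, assuming formula (1) is the weighted sum reconstructed above and assuming OpenCV's HSV scaling (H in [0,179], S and V in [0,255]); rescaling every channel to [0,1] before weighting is an implementation assumption, not something the patent specifies.

```python
import numpy as np

# Empirical weights from the description: hue 0.33, saturation 0.41, brightness 0.0, gray 0.26.
WEIGHTS = np.array([0.33, 0.41, 0.0, 0.26], dtype=np.float64)

def pixel_feature_values(gray, hsv):
    """Z(x, y): weighted sum of the descriptor elements [H, S, V, gray] for every pixel."""
    h = hsv[:, :, 0].astype(np.float64) / 179.0   # hue channel, OpenCV range assumed
    s = hsv[:, :, 1].astype(np.float64) / 255.0   # saturation channel
    v = hsv[:, :, 2].astype(np.float64) / 255.0   # brightness channel (weight is 0 anyway)
    g = gray.astype(np.float64) / 255.0           # gray value from image A
    descriptor = np.stack([h, s, v, g], axis=-1)  # T(x, y) for every pixel
    return descriptor @ WEIGHTS                   # Z(x, y), shape (rows, cols)
```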
Step S13: determining a sub-window corresponding to each pixel point in the image to be detected, calculating the total characteristic contribution rate of the sub-window based on the pixel characteristic values of the pixel points in the sub-window, and determining an optimal window based on the total characteristic contribution rate; the number of pixel points in each sub-window is the same.
Specifically, in step S13, each pixel point is taken as the upper-left vertex of a k×k sub-window. In the image to be detected F, a window of size n×n is centered on each pixel point, and the k×k sub-window is slid within this window from left to right and from top to bottom, so that the sub-window corresponding to each pixel point in the window is obtained. Referring to fig. 2, which shows the sub-windows, α denotes the sub-window of one pixel point and β denotes the sub-window of another pixel point.
After the sub-window is determined, calculating the total characteristic contribution rate of the sub-window based on the pixel characteristic values of the pixel points in the sub-window.
Specifically, calculating the total feature contribution rate includes:
s131: and calculating the sub-window characteristic value of the sub-window corresponding to each pixel point based on the difference of the pixel characteristic values of each pixel point in the sub-window.
In a specific embodiment, the sub-window feature value of the sub-window corresponding to each pixel point is calculated using the following formula (2):

G(x,y) = (1/k²) · Σ_{(a,b)∈Ω(x,y)} |Z(x,y) − Z(a,b)|    (2);

where G(x,y) denotes the sub-window feature value of the sub-window in which pixel point (x,y) is located; the larger this value, the greater the difference between pixel point (x,y) and the other pixel points in the sub-window and the more information the sub-window contains, while the smaller this value, the smaller that difference and the less information the sub-window contains. k denotes the side length of the sub-window in which pixel point (x,y) is located and usually takes the empirical value 2, Ω(x,y) denotes the set of pixel points in the sub-window, Z(x,y) denotes the pixel feature value of pixel point (x,y), and Z(a,b) denotes the pixel feature value of pixel point (a,b) in the sub-window.
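The sketch below implements one plausible reading of formula (2): the mean absolute difference between a pixel's feature value and the feature values inside the k×k sub-window whose upper-left vertex is that pixel. Because the patent's formula image is not reproduced in the text, this exact functional form is an assumption.

```python
import numpy as np

def subwindow_feature_values(Z, k=2):
    """G(x, y): mean absolute difference between Z(x, y) and the pixel feature values
    in the k x k sub-window anchored at (x, y)."""
    rows, cols = Z.shape
    G = np.zeros_like(Z)
    for x in range(rows - k + 1):
        for y in range(cols - k + 1):
            block = Z[x:x + k, y:y + k]
            G[x, y] = np.abs(block - Z[x, y]).mean()
    return G
```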
S132: and calculating a window characteristic difference based on the sub-window characteristic value.
In one embodiment, the window feature difference is calculated using the following formula (3):

C = Σ_{(x,y)∈W} G(x,y)    (3);

where C denotes the window feature difference, W denotes the set of pixel points in the window, and G(x,y) denotes the sub-window feature value of the sub-window corresponding to pixel point (x,y). The larger the window feature difference, the more information the window contains and the greater the differences between pixel points, so the better the effect of subsequent threshold segmentation; the smaller the window feature difference, the less information the window contains and the smaller the differences between pixel points, so the worse the effect of subsequent threshold segmentation.
S133: and calculating the sub-window characteristic contribution rate of the sub-window corresponding to each pixel point based on the sub-window characteristic value and the window characteristic difference.
In a specific embodiment, the sub-window feature contribution rate of each sub-window is calculated using the following formula (4):

g(x,y) = G(x,y) / C    (4);

where g(x,y) denotes the sub-window feature contribution rate of the sub-window corresponding to pixel point (x,y). The larger the sub-window feature contribution rate, the more information the sub-window of pixel point (x,y) contains relative to the whole window and the larger the pixel differences, so the better the effect of threshold segmentation on the sub-window; the smaller it is, the less information the sub-window contains relative to the whole and the smaller the pixel differences, so the worse the effect of threshold segmentation on the sub-window.
S134, calculating the total characteristic contribution rate based on the sub-window characteristic contribution rate of each sub-window.
In one embodiment, the total feature contribution rate is calculated using the following formula (5):

B_Q = Σ_{p=1}^{Q} g_p    (5);

where Q denotes the Q-th sub-window after the sub-window feature contribution rates are sorted from large to small, B_Q denotes the sum of the sub-window feature contribution rates of the 1st to Q-th sub-windows after sorting from large to small, i.e. the total feature contribution rate, and g_p denotes the sub-window feature contribution rate of the p-th sub-window after sorting from large to small.
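Putting formulas (3)-(5) together under the reconstruction above, the sketch below turns the sub-window feature values inside one window into contribution rates and counts how many sub-windows are needed before the cumulative (total) contribution rate first reaches a threshold; the 70% value is the empirical threshold quoted later in the description.

```python
import numpy as np

def count_to_reach_contribution(G, threshold=0.70):
    """Return (Q, C): the number of sub-windows whose sorted contribution rates sum to
    `threshold`, and the window feature difference C (sum of sub-window feature values)."""
    g_flat = G.ravel()
    C = float(g_flat.sum())            # window feature difference (formula (3), as reconstructed)
    if C == 0.0:
        return G.size, C               # degenerate window with no pixel differences
    rates = np.sort(g_flat / C)[::-1]  # sub-window feature contribution rates, largest first
    cumulative = np.cumsum(rates)      # total feature contribution rate B_Q for Q = 1, 2, ...
    Q = int(np.searchsorted(cumulative, threshold)) + 1
    return Q, C
```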
After the total feature contribution rate is obtained, an optimal window is further determined based on it: the pixel difference concentration within a window of a preset size is determined based on the total feature contribution rate, and the optimal window is then determined based on the pixel difference concentration. In one embodiment, the pixel difference concentration is calculated using the following formula (6):

J_n = Norm(C_n) / u_n    (6);

where n denotes the window side length as it changes from n1 to n2, J_n denotes the pixel difference concentration within the window when the window size is n×n, Norm(C_n) denotes the window feature difference C_n within the n×n window normalized by the Z-score method, and u_n denotes the number of pixel feature values in the sub-windows when the total feature contribution rate of the sub-windows within the window reaches the threshold.
Specifically, the initial window size is n1×n1, where n1 usually takes the empirical value 11; the window side length is then increased by 2 each time up to a maximum window of n2×n2, where n2 usually takes the empirical value 21, and the window size during this change is denoted n×n. Thus, as the window changes from 11×11 to 21×21, n takes the values 11, 13, …, 21 in order. A cumulative sub-window feature contribution rate threshold is set, and the number of sub-window pixel feature values at the moment the cumulative sub-window feature contribution rate within the window reaches this threshold is denoted u_n.
Specifically, a larger pixel difference concentration indicates that more of the information in the window is concentrated in fewer pixel points, so threshold segmentation of the window works better; a smaller pixel difference concentration indicates that each pixel point in the window carries little information, so threshold segmentation of the window works worse. The cumulative sub-window feature contribution rate threshold typically takes the empirical value 70%; that is, for each candidate window size the proportion of the window needed to contain 70% of the information content is compared, and the window with the largest pixel difference concentration is selected as the optimal window for local binarization.
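Combining the pieces, the following sketch of the adaptive window selection evaluates each candidate side length n (11, 13, …, 21), computes the pixel difference concentration J_n = Norm(C_n)/u_n as reconstructed above, and keeps the side length with the largest concentration. It reuses count_to_reach_contribution from the previous sketch; the Z-score normalization across candidate sizes is the normalization named in the description, while the dictionary input format and tie handling are assumptions.

```python
import numpy as np

def select_optimal_window(G_per_size, threshold=0.70):
    """Pick the window side length n with the largest pixel difference concentration J_n.

    G_per_size maps each candidate side length n to the array of sub-window feature
    values G computed inside one n x n window."""
    sizes = sorted(G_per_size)
    C_vals, u_vals = [], []
    for n in sizes:
        Q, C = count_to_reach_contribution(G_per_size[n], threshold)
        C_vals.append(C)
        u_vals.append(Q)
    C_vals = np.array(C_vals, dtype=np.float64)
    # Z-score normalization of the window feature differences across the candidate sizes.
    C_norm = (C_vals - C_vals.mean()) / (C_vals.std() + 1e-12)
    J = C_norm / np.array(u_vals, dtype=np.float64)  # pixel difference concentration J_n
    return sizes[int(np.argmax(J))]
```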
Step S14: dividing the gray level image based on the optimal window, and detecting the quality of the plastic toy based on the divided gray level image.
Specifically, the gray image is processed with a local binarization algorithm to obtain a local binary image. In a specific embodiment, the local binarization Sauvola algorithm is used: the mean and standard deviation of the gray values of the pixel points in the window are calculated, the comparison factor K takes the empirical value 0.5, and the maximum gray standard deviation R takes the empirical value 128, from which the segmentation threshold within the window can be calculated. The gray value of the central pixel point of the window is compared with this threshold: if it is greater than the threshold it is assigned 255, and if it is less than or equal to the threshold it is assigned 0. This completes the local binarization of the gray image A and yields the local binary image C.
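A compact sketch of the Sauvola thresholding step with the quoted parameters (K = 0.5, R = 128), using the standard Sauvola threshold T = m·(1 + K·(s/R − 1)) computed from the local mean m and standard deviation s; running it with the adaptively selected window side length n is the refinement described here. The box-filter implementation is an assumption.

```python
import numpy as np
import cv2

def sauvola_binarize(gray, n, K=0.5, R=128.0):
    """Local Sauvola binarization of gray image A with an n x n window."""
    g = gray.astype(np.float64)
    mean = cv2.boxFilter(g, ddepth=-1, ksize=(n, n))       # local mean m
    mean_sq = cv2.boxFilter(g * g, ddepth=-1, ksize=(n, n))
    std = np.sqrt(np.maximum(mean_sq - mean * mean, 0.0))  # local standard deviation s
    T = mean * (1.0 + K * (std / R - 1.0))                 # Sauvola segmentation threshold
    return np.where(g > T, 255, 0).astype(np.uint8)        # local binary image C
```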
The local binary images C are spliced to obtain the detection image of the plastic toy. Specifically, morphological erosion and dilation are applied to the binary image C to connect the fragments present in it, yielding a complete plastic toy region, namely the detection image.
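A brief sketch of the fragment-connecting step, assuming it amounts to a morphological closing (dilation followed by erosion) with a small rectangular structuring element; the 5×5 kernel size is an assumption.

```python
import numpy as np
import cv2

def connect_fragments(binary_img, kernel_size=5):
    """Connect fragments in the local binary image C by dilation followed by erosion."""
    kernel = np.ones((kernel_size, kernel_size), np.uint8)
    return cv2.morphologyEx(binary_img, cv2.MORPH_CLOSE, kernel)
```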
The detection image is compared with a reference image to obtain the similarity. Specifically, the plastic toy region in the local binary image C is matched against the reference image using the NCC (normalized cross-correlation) matching algorithm to obtain the similarity between the plastic toy in the binary image C and the reference image. It can be understood that the reference image is an image of a plastic toy whose parameters are normal.
If the similarity is greater than or equal to the threshold, the plastic toy has no quality problem; if the similarity is smaller than the threshold, the plastic toy has a quality problem. Specifically, the similarity is compared with a preset threshold: if the similarity is smaller than the threshold, a certain quality problem exists and corresponding treatment is needed according to the problem found; if the similarity is greater than or equal to the threshold, there is no quality problem and the toy can be packaged and sold. The similarity threshold typically takes the empirical value 0.9.
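One way to obtain such a similarity score, assuming the detection image and the reference image are comparable single-channel images, is OpenCV's normalized cross-correlation template matching; taking the peak response as the similarity and comparing it with the empirical threshold of 0.9 follows the description, while the exact matching setup is an assumption.

```python
import cv2

def similarity_to_reference(detection_img, reference_img):
    """Peak normalized cross-correlation (NCC) between the detection image and the reference image."""
    response = cv2.matchTemplate(detection_img, reference_img, cv2.TM_CCORR_NORMED)
    _, max_val, _, _ = cv2.minMaxLoc(response)
    return max_val

def has_quality_problem(detection_img, reference_img, threshold=0.9):
    """Flag a quality problem when the similarity falls below the threshold of 0.9."""
    return similarity_to_reference(detection_img, reference_img) < threshold
```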
The core idea of the Sauvola local binarization algorithm is to determine a local threshold from the mean and variance of the gray values of the pixel points in a window centered on each pixel point, and to binarize the central pixel point according to its relationship to the local threshold within the window, thereby binarizing the image. The traditional Sauvola algorithm uses a fixed window size; however, the plastic toy contains abundant detail and the light and shadow in the image are unevenly distributed because of illumination, so a fixed window size leads to inaccurate segmentation. The application uses an improved Sauvola local binarization algorithm: as the window size changes from 11×11 to 21×21, the most suitable window is selected, completing adaptive window size selection, and the plastic toy image is then threshold-segmented for quality detection.
In summary, the plastic toy quality detection method of the application constructs pixel feature descriptors from the features of the same pixel point in the gray image A and the color space image B, and calculates pixel feature values from them; these feature values remove the influence of light and shadow and represent the pixel points more accurately. The sub-window feature value, window feature difference and sub-window feature contribution rate of each pixel point within a constructed window express the amount of information and degree of difference carried by each pixel point, and the pixel difference concentration built on them describes more accurately how concentrated the differences between pixel points in a window are: the larger the concentration, the more suitable the window is for threshold segmentation. The optimal window for local threshold segmentation is obtained accordingly, and segmenting with this optimal window improves segmentation accuracy and hence the accuracy of the quality detection result.
Referring to fig. 3, a schematic structural diagram of an embodiment of a plastic toy quality detecting device according to the present application specifically includes: an image acquisition module 31, a feature determination module 32, a window determination module 33, and a quality detection module 34.
The image acquisition module 31 is used for acquiring an image to be detected of the plastic toy, wherein the image to be detected comprises a gray level image and a color space image. The feature determining module 32 is configured to construct a pixel feature descriptor based on the gray-scale image and the color space image, and determine a pixel feature value of each pixel point in the image to be measured based on the pixel feature descriptor, where the pixel feature value characterizes a color feature of each pixel point. The window determining module 33 is configured to determine a sub-window corresponding to each pixel point in the image to be measured, calculate a total feature contribution rate of the sub-window based on pixel feature values of the pixel points in the sub-window, and determine an optimal window based on the total feature contribution rate; the number of pixel points in each sub-window is the same. The quality detection module 34 is configured to segment the gray-scale image based on the optimal window, and detect the quality of the plastic toy based on the segmented gray-scale image.
The execution steps of each module in this embodiment are shown in fig. 1, and are not repeated here.
Fig. 4 is a schematic structural diagram of an electronic device according to an embodiment of the application. The electronic device comprises a memory 52 and a processor 51 connected to each other.
The memory 52 is used to store program instructions for implementing the method of any of the above.
The processor 51 is operative to execute program instructions stored in the memory 52.
The processor 51 may also be referred to as a CPU (Central Processing Unit). The processor 51 may be an integrated circuit chip with signal processing capabilities. The processor 51 may also be a general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or discrete hardware components. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor or the like.
The memory 52 may be a memory bank, a TF card, or the like, and can store all information in the electronic device, including input raw data, computer programs, intermediate operation results and final operation results. It stores and retrieves information according to the location specified by the controller; with the memory, the electronic device has a storage function and can operate normally. By purpose, the memory in an electronic device can be divided into main memory (internal memory) and auxiliary memory (external memory). External memory is usually a magnetic medium, an optical disc or the like and can store information for a long time, whereas internal memory refers to the storage components on the motherboard that hold the data and programs currently being executed; it only stores programs and data temporarily, and the data is lost when the power is turned off.
In the several embodiments provided in the present application, it should be understood that the disclosed method and apparatus may be implemented in other manners. For example, the apparatus embodiments described above are merely illustrative, e.g., the division of modules or units is merely a logical functional division, and there may be additional divisions when actually implemented, e.g., multiple units or components may be combined or integrated into another system, or some features may be omitted or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed with each other may be an indirect coupling or communication connection via some interfaces, devices or units, which may be in electrical, mechanical or other form.
The units described as separate units may or may not be physically separate, and units shown as units may or may not be physical units, may be located in one place, or may be distributed over a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the embodiment.
In addition, each functional unit in the embodiments of the present application may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated units may be implemented in hardware or in software functional units.
The integrated units, if implemented in the form of software functional units and sold or used as stand-alone products, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present application may be embodied in essence or a part contributing to the prior art or all or part of the technical solution in the form of a software product stored in a storage medium, including several instructions to cause a computer device (which may be a personal computer, a system server, or a network device, etc.) or a processor (processor) to perform all or part of the steps of the method of the embodiments of the present application.
Fig. 5 is a schematic structural diagram of a storage medium according to the present application. The storage medium of the present application stores a program file 61 capable of implementing all the methods described above, wherein the program file 61 may be stored in the storage medium in the form of a software product, and includes several instructions for causing a computer device (which may be a personal computer, a server, or a network device, etc.) or a processor (processor) to perform all or part of the steps of the methods of the embodiments of the present application. The aforementioned storage device includes: a U-disk, a removable hard disk, a Read-Only Memory (ROM), a random access Memory (RAM, random Access Memory), a magnetic disk, an optical disk, or other various media capable of storing program codes, or a terminal device such as a computer, a server, a mobile phone, a tablet, or the like.
The foregoing is only the embodiments of the present application, and therefore, the patent scope of the application is not limited thereto, and all equivalent structures or equivalent processes using the descriptions of the present application and the accompanying drawings, or direct or indirect application in other related technical fields, are included in the scope of the application.

Claims (7)

1. A method for detecting the quality of a plastic toy, comprising the steps of:
obtaining an image to be detected of the plastic toy, wherein the image to be detected comprises a gray level image and a color space image;
constructing a pixel characteristic descriptor based on the gray level image and the color space image, and determining a pixel characteristic value of each pixel point in the image to be detected based on the pixel characteristic descriptor, wherein the pixel characteristic value represents the color characteristic of each pixel point;
determining a sub-window corresponding to each pixel point in the image to be detected, calculating the total characteristic contribution rate of the sub-window based on the pixel characteristic values of the pixel points in the sub-window, and determining an optimal window based on the total characteristic contribution rate; the number of pixel points in each sub-window is the same;
dividing the gray level image based on the optimal window, and detecting the quality of the plastic toy based on the divided gray level image;
constructing a pixel feature descriptor based on the grayscale image and the color space image, comprising:
constructing a feature descriptor based on the gray value of each pixel point in the gray image and the hue, saturation and brightness attributes of each pixel point in the color space image; the feature descriptor is T(x,y) = [H(x,y), S(x,y), V(x,y), A(x,y)], wherein T(x,y) represents the pixel feature descriptor of pixel point (x,y), H(x,y) represents the hue, S(x,y) represents the saturation, V(x,y) represents the brightness attribute, and A(x,y) represents the gray value of pixel point (x,y);
determining a pixel characteristic value of each pixel point in the image to be detected based on the pixel characteristic descriptors, wherein the pixel characteristic value comprises:
determining a pixel characteristic value of each pixel point in the image to be detected based on the weights of the hue, the saturation, the brightness attribute and the gray value in the pixel characteristic descriptor and the values of the hue, the saturation, the brightness attribute and the gray value in the pixel characteristic descriptor;
calculating the total feature contribution rate of the sub-window based on the pixel feature values of the pixel points in the sub-window, including:
calculating a sub-window characteristic value of the sub-window corresponding to each pixel point based on the difference of the pixel characteristic values of each pixel point in the sub-window;
calculating a window feature difference based on the sub-window feature value;
calculating the sub-window characteristic contribution rate of the sub-window corresponding to each pixel point based on the sub-window characteristic value and the window characteristic difference;
calculating a total characteristic contribution rate based on the characteristic contribution rate of each sub-window;
calculating a sub-window feature value of the sub-window corresponding to each pixel point by using the following formula (2):

G(x,y) = (1/k²) · Σ_{(a,b)∈Ω(x,y)} |Z(x,y) − Z(a,b)|    (2);

wherein G(x,y) represents the sub-window feature value of the sub-window in which pixel point (x,y) is located, k represents the side length of that sub-window, Ω(x,y) represents the set of pixel points in the sub-window, Z(x,y) represents the pixel feature value of pixel point (x,y), and Z(a,b) represents the pixel feature value of pixel point (a,b);

calculating a window feature difference by using the following formula (3):

C = Σ_{(x,y)∈W} G(x,y)    (3);

wherein C represents the window feature difference, W represents the set of pixel points in the window, and G(x,y) represents the sub-window feature value of the sub-window corresponding to pixel point (x,y);

calculating the sub-window feature contribution rate of each sub-window by using the following formula (4):

g(x,y) = G(x,y) / C    (4);

wherein g(x,y) represents the sub-window feature contribution rate of the sub-window corresponding to pixel point (x,y);

calculating the total feature contribution rate by using the following formula (5):

B_Q = Σ_{p=1}^{Q} g_p    (5);

wherein Q represents the Q-th sub-window after the sub-window feature contribution rates are sorted from large to small, B_Q represents the sum of the sub-window feature contribution rates of the 1st to Q-th sub-windows after sorting from large to small, i.e. the total feature contribution rate, and g_p represents the sub-window feature contribution rate of the p-th sub-window after sorting from large to small.
2. The method of claim 1, wherein the pixel feature value of each pixel point in the image to be detected is determined using the following formula (1):

Z(x,y) = Σ_{i=1}^{m} w_i · t_i    (1);

wherein Z(x,y) represents the pixel feature value of pixel point (x,y); m represents the vector length of the pixel feature descriptor; w_i represents the weight of the i-th element in the pixel feature descriptor — when i is 1, w_1 represents the weight of the hue H channel component value of pixel point (x,y); when i is 2, w_2 represents the weight of the saturation S channel component value; when i is 3, w_3 represents the weight of the brightness attribute V channel component value; when i is 4, w_4 represents the weight of the gray value; and t_i represents the value of the i-th element in the pixel feature descriptor, taken from pixel point (x,y) in the color space image for i ≤ 3 and from pixel point (x,y) in the gray image for i = 4.
3. The method of claim 1, wherein determining an optimal window based on the total feature contribution rate comprises:
determining pixel difference concentration in a window with a preset size based on the total characteristic contribution rate;
determining the optimal window based on the pixel difference concentration;
dividing the gray level image based on the optimal window, detecting the quality of the plastic toy based on the divided gray level image, and comprising the following steps:
processing the gray level image by using a local binarization algorithm to obtain a local binary image;
splicing the local binary images to obtain a detection image of the plastic toy;
comparing the detection image with a reference image to obtain similarity;
if the similarity is greater than or equal to a threshold, the plastic toy has no quality problem, and if the similarity is smaller than the threshold, the plastic toy has a quality problem.
4. A method according to claim 3, wherein determining the concentration of pixel differences in a window of a preset size based on the total feature contribution rate comprises:
the pixel difference concentration is calculated using the following formula (6):

J_n = Norm(C_n) / u_n    (6);

wherein n represents the window side length as it changes from n1 to n2, J_n represents the pixel difference concentration within the window when the window size is n×n, Norm(C_n) represents the normalization result of the window feature difference C_n within the n×n window, and u_n represents the number of pixel feature values in the sub-windows when the total feature contribution rate of the sub-windows within the window reaches the threshold.
5. A plastic toy quality inspection device, comprising:
the image acquisition module is used for acquiring an image to be detected of the plastic toy, wherein the image to be detected comprises a gray level image and a color space image;
the characteristic determining module is used for constructing a pixel characteristic descriptor based on the gray level image and the color space image, determining a pixel characteristic value of each pixel point in the image to be detected based on the pixel characteristic descriptor, and representing the color characteristic of each pixel point;
the window determining module is used for determining a sub-window corresponding to each pixel point in the image to be detected, calculating the total characteristic contribution rate of the sub-window based on the pixel characteristic values of the pixel points in the sub-window, and determining an optimal window based on the total characteristic contribution rate; the number of pixel points in each sub-window is the same;
the quality detection module is used for dividing the gray level image based on the optimal window and detecting the quality of the plastic toy based on the divided gray level image;
constructing a pixel feature descriptor based on the grayscale image and the color space image, comprising:
constructing a feature descriptor based on the gray value of each pixel point in the gray image and the hue, saturation and brightness attributes of each pixel point in the color space image; the feature descriptor is T(x,y) = [H(x,y), S(x,y), V(x,y), A(x,y)], wherein T(x,y) represents the pixel feature descriptor of pixel point (x,y), H(x,y) represents the hue, S(x,y) represents the saturation, V(x,y) represents the brightness attribute, and A(x,y) represents the gray value of pixel point (x,y);
determining a pixel characteristic value of each pixel point in the image to be detected based on the pixel characteristic descriptors, wherein the pixel characteristic value comprises:
determining a pixel characteristic value of each pixel point in the image to be detected based on the weights of the hue, the saturation, the brightness attribute and the gray value in the pixel characteristic descriptor and the values of the hue, the saturation, the brightness attribute and the gray value in the pixel characteristic descriptor;
calculating the total feature contribution rate of the sub-window based on the pixel feature values of the pixel points in the sub-window, including:
calculating a sub-window characteristic value of the sub-window corresponding to each pixel point based on the difference of the pixel characteristic values of each pixel point in the sub-window;
calculating a window feature difference based on the sub-window feature value;
calculating the sub-window characteristic contribution rate of the sub-window corresponding to each pixel point based on the sub-window characteristic value and the window characteristic difference;
calculating a total characteristic contribution rate based on the characteristic contribution rate of each sub-window;
calculating a sub-window feature value of the sub-window corresponding to each pixel point by using the following formula (2):

G(x,y) = (1/k²) · Σ_{(a,b)∈Ω(x,y)} |Z(x,y) − Z(a,b)|    (2);

wherein G(x,y) represents the sub-window feature value of the sub-window in which pixel point (x,y) is located, k represents the side length of that sub-window, Ω(x,y) represents the set of pixel points in the sub-window, Z(x,y) represents the pixel feature value of pixel point (x,y), and Z(a,b) represents the pixel feature value of pixel point (a,b);

calculating a window feature difference by using the following formula (3):

C = Σ_{(x,y)∈W} G(x,y)    (3);

wherein C represents the window feature difference, W represents the set of pixel points in the window, and G(x,y) represents the sub-window feature value of the sub-window corresponding to pixel point (x,y);

calculating the sub-window feature contribution rate of each sub-window by using the following formula (4):

g(x,y) = G(x,y) / C    (4);

wherein g(x,y) represents the sub-window feature contribution rate of the sub-window corresponding to pixel point (x,y);

calculating the total feature contribution rate by using the following formula (5):

B_Q = Σ_{p=1}^{Q} g_p    (5);

wherein Q represents the Q-th sub-window after the sub-window feature contribution rates are sorted from large to small, B_Q represents the sum of the sub-window feature contribution rates of the 1st to Q-th sub-windows after sorting from large to small, i.e. the total feature contribution rate, and g_p represents the sub-window feature contribution rate of the p-th sub-window after sorting from large to small.
6. An electronic device, the electronic device comprising: a processor and a memory coupled to each other;
the memory is used for storing program instructions for implementing the method according to any one of claims 1-4;
the processor is configured to execute the program instructions stored in the memory.
7. A storage medium storing a program file executable to implement the method of any one of claims 1 to 4.
CN202310762311.0A 2023-06-27 2023-06-27 Plastic toy quality detection method and device, electronic equipment and storage medium Active CN116503404B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310762311.0A CN116503404B (en) 2023-06-27 2023-06-27 Plastic toy quality detection method and device, electronic equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202310762311.0A CN116503404B (en) 2023-06-27 2023-06-27 Plastic toy quality detection method and device, electronic equipment and storage medium

Publications (2)

Publication Number Publication Date
CN116503404A (en) 2023-07-28
CN116503404B (en) 2023-09-01

Family

ID=87325171

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310762311.0A Active CN116503404B (en) 2023-06-27 2023-06-27 Plastic toy quality detection method and device, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN116503404B (en)

Patent Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2016127883A1 (en) * 2015-02-12 2016-08-18 阿里巴巴集团控股有限公司 Image area detection method and device
WO2017092431A1 (en) * 2015-12-01 2017-06-08 乐视控股(北京)有限公司 Human hand detection method and device based on skin colour
WO2022100048A1 (en) * 2020-11-11 2022-05-19 海宁奕斯伟集成电路设计有限公司 Image processing method and apparatus, electronic device, and readable storage medium
CN115311270A (en) * 2022-10-11 2022-11-08 南通至顺聚氨酯材料有限公司 Plastic product surface defect detection method
CN115439494A (en) * 2022-11-08 2022-12-06 山东大拇指喷雾设备有限公司 Spray image processing method for quality inspection of sprayer
CN115457041A (en) * 2022-11-14 2022-12-09 安徽乾劲企业管理有限公司 Road quality identification and detection method
CN115937204A (en) * 2023-01-09 2023-04-07 江苏惠汕新能源集团有限公司 Welded pipe production quality detection method
CN116188462A (en) * 2023-04-24 2023-05-30 深圳市翠绿贵金属材料科技有限公司 Noble metal quality detection method and system based on visual identification
CN116309559A (en) * 2023-05-17 2023-06-23 山东鲁玻玻璃科技有限公司 Intelligent identification method for production flaws of medium borosilicate glass
CN116309570A (en) * 2023-05-18 2023-06-23 山东亮马新材料科技有限公司 Titanium alloy bar quality detection method and system

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Local threshold binarization method based on image blocks; 张洁玉; Computer Applications (No. 03); full text *

Also Published As

Publication number Publication date
CN116503404A (en) 2023-07-28

Similar Documents

Publication Publication Date Title
CN110276754B (en) Surface defect detection method, terminal device and storage medium
CN108764358B (en) Terahertz image identification method, device and equipment and readable storage medium
CN111401324A (en) Image quality evaluation method, device, storage medium and electronic equipment
CN111639629B (en) Pig weight measurement method and device based on image processing and storage medium
CN112308854B (en) Automatic detection method and system for chip surface flaws and electronic equipment
JP2018081442A (en) Learned model generating method and signal data discrimination device
CN111369605A (en) Infrared and visible light image registration method and system based on edge features
CN111931751A (en) Deep learning training method, target object identification method, system and storage medium
CN113609984A (en) Pointer instrument reading identification method and device and electronic equipment
CN114723677A (en) Image defect detection method, image defect detection device, image defect detection equipment and storage medium
CN110766657B (en) Laser interference image quality evaluation method
CN116559111A (en) Sorghum variety identification method based on hyperspectral imaging technology
CN113743378B (en) Fire monitoring method and device based on video
CN116503404B (en) Plastic toy quality detection method and device, electronic equipment and storage medium
CN114998980B (en) Iris detection method and device, electronic equipment and storage medium
CN113450323B (en) Quality detection method and device, electronic equipment and computer readable storage medium
CN111523605B (en) Image identification method and device, electronic equipment and medium
CN113750440B (en) Method and system for identifying and counting rope skipping data
CN111242047A (en) Image processing method and apparatus, electronic device, and computer-readable storage medium
WO2020107196A1 (en) Photographing quality evaluation method and apparatus for photographing apparatus, and terminal device
CN111222504A (en) Bullet hole target scoring method, device, equipment and medium
CN117557820B (en) Quantum dot optical film damage detection method and system based on machine vision
CN113688845B (en) Feature extraction method and device suitable for hyperspectral remote sensing image and storage medium
US11232289B2 (en) Face identification method and terminal device using the same
CN111914632B (en) Face recognition method, device and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant