CN115063605B - Method for identifying color printing package by using electronic equipment - Google Patents

Method for identifying color printing package by using electronic equipment

Info

Publication number
CN115063605B
CN115063605B
Authority
CN
China
Prior art keywords
gradient
information
gray
image
feature
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202210980250.0A
Other languages
Chinese (zh)
Other versions
CN115063605A (en)
Inventor
王卓
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nantong Zhuoyue Digital Technology Co ltd
Original Assignee
Nantong Zhuoyue Digital Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nantong Zhuoyue Digital Technology Co ltd filed Critical Nantong Zhuoyue Digital Technology Co ltd
Priority to CN202210980250.0A priority Critical patent/CN115063605B/en
Publication of CN115063605A publication Critical patent/CN115063605A/en
Application granted granted Critical
Publication of CN115063605B publication Critical patent/CN115063605B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/40 Extraction of image or video features
    • G06V10/44 Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/40 Extraction of image or video features
    • G06V10/46 Descriptors for shape, contour or point-related descriptors, e.g. scale invariant feature transform [SIFT] or bags of words [BoW]; Salient regional features
    • G06V10/462 Salient features, e.g. scale invariant feature transforms [SIFT]
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/40 Extraction of image or video features
    • G06V10/56 Extraction of image or video features relating to colour
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/70 Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/74 Image or video pattern matching; Proximity measures in feature spaces
    • G06V10/75 Organisation of the matching processes, e.g. simultaneous or sequential comparisons of image or video features; Coarse-fine approaches, e.g. multi-scale approaches; using context analysis; Selection of dictionaries
    • G06V10/751 Comparing pixel values or logical combinations thereof, or feature values having positional relevance, e.g. template matching

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Multimedia (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Evolutionary Computation (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Software Systems (AREA)
  • Databases & Information Systems (AREA)
  • Computing Systems (AREA)
  • Artificial Intelligence (AREA)
  • Health & Medical Sciences (AREA)
  • Image Analysis (AREA)

Abstract

The invention relates to the technical field of artificial intelligence, and in particular to a method for identifying color printing packages by using electronic equipment. The method obtains the edge information retention degree of the current gray level image from the difference between the gradient information in the HIS image and that in the gray level image, analyzes the first feature descriptors of the key points in the gray level image to obtain the feature information retention degree of each first feature descriptor, and obtains a recommendation degree from the edge information retention degree and the feature information retention degree. An optimal standard gray level conversion formula is then selected for processing the color printing package image to be detected and the template color printing package image, enabling efficient matching and identification. The embodiment of the invention optimizes the color printing package detection algorithm within an artificial intelligence system for the production field, increases the identification and detection efficiency and precision for color printing package products, and realizes identification and detection of mass-produced products under a color printing template by using the standard gray level conversion formula and computer vision software.

Description

Method for identifying color printing package by using electronic equipment
Technical Field
The invention relates to the technical field of artificial intelligence, in particular to a method for identifying color printing packages by utilizing electronic equipment.
Background
With the development of production technology, requirements on product quality have become increasingly strict. Packaging is an important component of a product, and its quality is likewise an important inspection item in the production field. Color printed packaging directly influences consumers' impression of a product and their willingness to purchase, so once the color printing template is determined, the produced color printing packages must be ensured to match that template during packaging production.
In color printing package quality inspection, manual inspection has low precision and low efficiency and cannot keep pace with automated production. Therefore, in automated production it is necessary to extract features of the color printing package image with computer vision technology and match them against the color printing template. A commonly used matching algorithm is the scale-invariant feature transform (SIFT), an interest-point detection algorithm that searches for key points across different scale spaces and matches them according to their feature information. Because color printing packages are rich in color, running SIFT directly on them is computationally expensive, so the color printing package image is usually converted into a gray image to reduce the computational load. However, gray conversion can cause loss of edge information of the color image and thereby degrade the subsequent matching result.
Disclosure of Invention
In order to solve the above technical problems, an object of the present invention is to provide a method for identifying a color-printed package by using an electronic device, which adopts the following technical solutions:
the invention provides a method for identifying color printing packages by using electronic equipment, which comprises the following steps:
obtaining a color printing package image; converting the color printing package image into an HIS color space to obtain an HIS image; obtaining a gray level image of the color printing package image according to a preset gray level conversion formula; obtaining gray gradient information in the gray level image; obtaining HIS gradient information according to the hue difference, the saturation difference and the brightness difference among the pixel points in the HIS image; obtaining the edge information retention degree of the gray level image according to the difference between the HIS gradient information and the gray gradient information;
extracting a plurality of first feature descriptors in the gray image according to an SIFT algorithm; obtaining a plurality of second feature descriptors according to the projection of each first feature descriptor in a plurality of principal component directions; obtaining the feature information retention degree of the corresponding first feature descriptor according to the plurality of second feature descriptors;
obtaining the recommendation degree of the gray level image according to the edge information retention degree and the feature information retention degree; changing parameters in the gray scale conversion formula to obtain a plurality of recommendation degrees, and taking the gray scale conversion formula corresponding to the maximum recommendation degree as a standard gray scale conversion formula;
acquiring a color printing package image to be detected; obtaining a to-be-detected gray level image of the to-be-detected color printing packaging image according to the standard gray level conversion formula; obtaining a template gray level image of the template color printing packaging image according to the standard gray level conversion formula; and acquiring and matching the characteristic points of the gray image to be detected and the template gray image according to an SIFT algorithm, and identifying the color printing packaging image to be detected according to a matching result.
Further, the obtaining gray gradient information in the gray level image comprises:
calculating the transverse gray gradient and the longitudinal gray gradient of each pixel point in the gray level image according to a Sobel operator, specifically:
Gx(x, y) = [f(x+1, y-1) + 2f(x+1, y) + f(x+1, y+1)] - [f(x-1, y-1) + 2f(x-1, y) + f(x-1, y+1)]
Gy(x, y) = [f(x-1, y+1) + 2f(x, y+1) + f(x+1, y+1)] - [f(x-1, y-1) + 2f(x, y-1) + f(x+1, y-1)]
wherein Gx(x, y) is the transverse gray gradient of the pixel point (x, y), Gy(x, y) is the longitudinal gray gradient of the pixel point (x, y), and f denotes the gray values of the eight neighborhood pixel points of the pixel point (x, y);
and obtaining a gray gradient amplitude and a gray gradient direction according to the transverse gray gradient and the longitudinal gray gradient of each pixel point, and taking the gray gradient amplitude and the gray gradient direction as the gray gradient information.
Further, the obtaining of the HIS gradient information according to the hue difference, the saturation difference and the brightness difference between the pixel points in the HIS image includes:
replacing the difference of pixel values in the transverse gray gradient and the longitudinal gray gradient with an HIS difference to obtain a transverse HIS gradient and a longitudinal HIS gradient, wherein the HIS difference D(i, j) between the i-th pixel point and the j-th pixel point [formula shown as an image; not reproduced] is obtained from H_i and H_j, the hue information of the i-th and j-th pixel points, S_i and S_j, their saturation information, and I_i and I_j, their brightness information;
and obtaining an HIS gradient amplitude and an HIS gradient direction according to the transverse HIS gradient and the longitudinal HIS gradient of each pixel point, and taking the HIS gradient amplitude and the HIS gradient direction as the HIS gradient information.
Further, the obtaining of the edge information retention degree of the gray level image according to the difference between the HIS gradient information and the gray gradient information includes:
obtaining the edge information retention degree according to an edge information retention degree formula [shown as an image; not reproduced], wherein Q is the edge information retention degree, S is the size of the color printing package image, A_HIS(x, y) is the HIS gradient amplitude of the pixel point at coordinate (x, y), A_gray(x, y) is the gray gradient amplitude of the pixel point at coordinate (x, y), θ_HIS(x, y) is the HIS gradient direction of the pixel point at coordinate (x, y), and θ_gray(x, y) is the gray gradient direction of the pixel point at coordinate (x, y).
Further, the obtaining a plurality of second feature descriptors according to the projection of each first feature descriptor in a plurality of principal component directions includes:
acquiring a plurality of eigenvalues of the covariance matrix of the first feature descriptors, and arranging the eigenvalues from large to small to obtain an eigenvalue sequence; taking the first several eigenvalues in the eigenvalue sequence as reference eigenvalues according to a preset selection quantity;
and acquiring the reference eigenvector corresponding to each reference eigenvalue, and multiplying the first feature descriptor by the reference eigenvectors to obtain a plurality of second feature descriptors.
Further, the obtaining the feature information retention degree of the corresponding first feature descriptor according to the plurality of second feature descriptors comprises:
obtaining the feature information retention degree according to a feature information retention degree formula [shown as an image; not reproduced], wherein F is the feature information retention degree, e is the natural constant, n is the selection quantity, P_k is the k-th second feature descriptor, λ_k is the eigenvalue corresponding to the k-th second feature descriptor, U is the number of eigenvalues, and λ_u is the u-th eigenvalue.
Further, the obtaining of the recommendation degree of the gray level image according to the edge information retention degree and the feature information retention degree includes:
obtaining the recommendation degree according to a recommendation degree formula [shown as an image; not reproduced], wherein T is the recommendation degree, Q is the edge information retention degree, F_m is the feature information retention degree of the m-th first feature descriptor, and N is the number of the first feature descriptors.
The invention has the following beneficial effects:
according to the embodiment of the invention, the HIS gradient information and the gray gradient information of the color printing packaging image are obtained at the same time, and the edge information retention degree of the gray image under the current gray conversion formula is obtained according to the difference of the two gradient information. And further extracting a first feature descriptor of the gray image by utilizing a SIFT algorithm, and obtaining the feature information retention of the first feature descriptor of the gray image under the current gray conversion formula through the projection information of the first feature descriptor in a plurality of principal component directions. And obtaining the recommendation degree of the gray level image under the current gray level conversion formula according to the edge information retention degree and the characteristic information retention degree. And selecting the parameters of the gray level conversion formula according to the recommendation degree, and finally obtaining a gray level image with clear edge information and clear feature information, so that the gray level image is convenient to match with the template gray level image. The embodiment of the invention optimizes the detection algorithm of the sampling package by the artificial intelligence optimization operation system under the artificial intelligence system in the production field, increases the identification detection efficiency and the identification detection precision of the color printing package product, and realizes the identification detection of the mass products under the color printing template by using a standard gray scale conversion formula and computer vision software.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings in the following description are only some embodiments of the present invention, and other drawings can be obtained by those skilled in the art without creative effort.
Fig. 1 is a flowchart of a method for identifying a color-printed package by an electronic device according to an embodiment of the present invention.
Detailed Description
To further explain the technical means adopted by the present invention to achieve the intended objects and their effects, the method for identifying color printing packages by using electronic equipment proposed by the present invention, together with its specific implementation, structure, features and effects, is described in detail below with reference to the accompanying drawings and preferred embodiments. In the following description, different references to "one embodiment" or "another embodiment" do not necessarily refer to the same embodiment. Furthermore, the particular features, structures, or characteristics may be combined in any suitable manner in one or more embodiments.
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs.
The following describes a specific scheme of the method for identifying color-printed packages by using electronic equipment in detail with reference to the accompanying drawings.
Referring to fig. 1, a flow chart of a method for identifying a color-printed package by an electronic device according to an embodiment of the present invention is shown, the method including:
step S1: obtaining a color printing package image; converting the sampling package image into an HIS color space to obtain an HIS image; obtaining a gray level image of the color printing package image according to a preset gray level conversion formula; obtaining gray gradient information in a gray image; acquiring HIS gradient information according to color difference, saturation difference and brightness difference among pixel points in the HIS image; and obtaining the edge information retention degree of the gray level image according to the difference between the HIS gradient information and the gray level gradient information.
In the embodiment of the invention, a product conveyor belt is arranged in the color printing package production scene, the produced color printing package products are placed on the conveyor belt, and the electronic equipment is arranged above the conveyor belt; the electronic equipment comprises a detection device and a rack for fixing the detection device. The detection device comprises: an imaging device, such as an industrial camera, for imaging the color printing package products in the conveyor belt area and outputting the imaging data; a light source device for providing brightness compensation for the imaging device; a sensing device for sensing the color printing package to be detected and outputting a sensing signal; and a central processing unit, electrically connected with the imaging device and the sensing device, for receiving the sensing signal and performing the corresponding data processing.
When the sensing device senses that the color-printed packaging products on the conveyor belt are conveyed to the lower part of the electronic equipment, the sensing signal is transmitted to the central processing unit, the central processing unit controls the light source equipment and the imaging equipment to acquire image data of the color-printed packaging products, and the imaging equipment transmits the image data to the central processing unit for corresponding data processing.
It should be noted that, in the production field, color printing package products are usually produced in large batches; that is, the color printing package products undergoing quality inspection all share one color printing style and correspond to one color printing template.
A color printing package image is acquired and converted into the HIS color space. It should be noted that the color printing package image collected by the electronic equipment is usually an RGB image, and the conversion of an RGB image into the HIS color space is well known in the art and is not described in detail herein.
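For reference, a minimal sketch of the standard RGB-to-HSI conversion that this step relies on is given below. The patent treats the conversion as known art; the function name and the use of NumPy are illustrative assumptions, not part of the original disclosure.

```python
import numpy as np

def rgb_to_hsi(rgb: np.ndarray) -> np.ndarray:
    """Convert an H x W x 3 RGB image (0..255) to hue, saturation, intensity in [0, 1].

    Standard HSI conversion; a sketch of the 'well known' step the patent relies on.
    """
    rgb = rgb.astype(np.float64) / 255.0
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]

    intensity = (r + g + b) / 3.0
    minimum = np.minimum(np.minimum(r, g), b)
    saturation = 1.0 - minimum / (intensity + 1e-12)

    # Hue from the geometric definition, normalized to [0, 1].
    num = 0.5 * ((r - g) + (r - b))
    den = np.sqrt((r - g) ** 2 + (r - b) * (g - b)) + 1e-12
    theta = np.arccos(np.clip(num / den, -1.0, 1.0))
    hue = np.where(b <= g, theta, 2.0 * np.pi - theta) / (2.0 * np.pi)

    return np.stack([hue, saturation, intensity], axis=-1)
```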
A gray level image of the color printing package image is obtained according to a preset gray level conversion formula, wherein the gray level conversion formula is:
Gray = a·R + b·G + c·B
wherein Gray is the gray value in the gray level image, R, G and B are the channel values of the red, green and blue channels of the color printing package image, and a, b and c are the weight parameters of the corresponding channels. In conventional gray scale conversion, a = 0.299, b = 0.587 and c = 0.114.
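The adjustable-weight conversion can be sketched as follows; the parameter names a, b and c follow the formula above, the RGB channel order and the clipping to the 8-bit range are implementation assumptions.

```python
import numpy as np

def to_gray(rgb: np.ndarray, a: float = 0.299, b: float = 0.587, c: float = 0.114) -> np.ndarray:
    """Weighted gray conversion Gray = a*R + b*G + c*B with tunable channel weights.

    Defaults are the conventional weights; the method later searches over (a, b, c).
    Assumes channel order R, G, B along the last axis.
    """
    rgb = rgb.astype(np.float64)
    gray = a * rgb[..., 0] + b * rgb[..., 1] + c * rgb[..., 2]
    return np.clip(gray, 0, 255).astype(np.uint8)
```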
when a color printed packaging image is converted into a gray image, information in the gray image is incomplete and edge information is lost due to disappearance of hue information. The HIS image comprises information of hue, saturation and brightness, and the edge information retention degree of the current gray level image can be obtained by comparing the information in the HIS image with the information of the current gray level image.
In order to obtain the edge information retention degree, the edge gradient information of the HIS image and of the gray level image must first be acquired. Preferably, the transverse gray gradient and the longitudinal gray gradient of each pixel point in the gray level image are calculated with the Sobel operator, specifically:
Gx(x, y) = [f(x+1, y-1) + 2f(x+1, y) + f(x+1, y+1)] - [f(x-1, y-1) + 2f(x-1, y) + f(x-1, y+1)]
Gy(x, y) = [f(x-1, y+1) + 2f(x, y+1) + f(x+1, y+1)] - [f(x-1, y-1) + 2f(x, y-1) + f(x+1, y-1)]
wherein Gx(x, y) is the transverse gray gradient of the pixel point (x, y), Gy(x, y) is the longitudinal gray gradient of the pixel point (x, y), and f denotes the gray values of the eight neighborhood pixel points of the pixel point (x, y).
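A sketch of this gradient computation on the gray image, using OpenCV's Sobel implementation (a tooling assumption; with a 3 x 3 kernel it is equivalent to the neighborhood sums above):

```python
import cv2
import numpy as np

def sobel_gradients(gray: np.ndarray):
    """Transverse (Gx) and longitudinal (Gy) gray gradients via the 3x3 Sobel kernels."""
    gx = cv2.Sobel(gray.astype(np.float64), cv2.CV_64F, 1, 0, ksize=3)
    gy = cv2.Sobel(gray.astype(np.float64), cv2.CV_64F, 0, 1, ksize=3)
    return gx, gy
```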
In the gray gradient calculation, the transverse and longitudinal gray gradients reflect the differences between the pixel values of the target pixel point's neighbors. When the HIS gradient information is analyzed, the hue difference, the saturation difference and the brightness difference must instead be considered simultaneously, specifically:
the pixel value differences in the transverse gray gradient and the longitudinal gray gradient are replaced by HIS differences to obtain a transverse HIS gradient and a longitudinal HIS gradient, wherein the HIS difference D(i, j) between the i-th pixel point and the j-th pixel point [formula shown as an image; not reproduced] is obtained from H_i and H_j, the hue information of the i-th and j-th pixel points, S_i and S_j, their saturation information, and I_i and I_j, their brightness information.
The HIS difference is adjusted by several constants to account for the different value ranges of hue, saturation and brightness. Since hue and saturation must be analyzed jointly in the HIS space, their combined difference is expressed as a chromaticity difference, while the difference in I is expressed as a brightness difference. A weight derived from the chromaticity difference is applied to the brightness difference, so that when the chromaticity difference is large the contribution of the brightness difference is reduced; this realizes the joint analysis of the three kinds of data.
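The exact adjustment constants appear only as an equation image in the publication, so the sketch below is just one possible instantiation of the behaviour described above (chromaticity difference from H and S, brightness difference down-weighted when the chromaticity difference is large); the combination and the constant k are assumptions made for illustration, not the patent's formula.

```python
import numpy as np

def his_difference(h1, s1, i1, h2, s2, i2, k: float = 1.0) -> float:
    """Illustrative HIS difference: chromaticity term plus a suppressed brightness term.

    Not the published formula (available only as an image); this only mimics the
    described behaviour.
    """
    chroma_diff = np.hypot(h1 - h2, s1 - s2)           # joint hue/saturation (chromaticity) difference
    brightness_weight = 1.0 / (1.0 + k * chroma_diff)  # larger chromaticity difference -> smaller weight
    return float(chroma_diff + brightness_weight * abs(i1 - i2))
```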
The gray gradient amplitude and the gray gradient direction are obtained from the transverse and longitudinal gray gradients of each pixel point and taken as the gray gradient information; likewise, the HIS gradient amplitude and the HIS gradient direction are obtained from the transverse and longitudinal HIS gradients of each pixel point and taken as the HIS gradient information. Both are computed in the same way; taking the gray gradient information as an example:
A(x, y) = sqrt(Gx(x, y)^2 + Gy(x, y)^2)
θ(x, y) = arctan(Gy(x, y) / Gx(x, y))
wherein A(x, y) is the gray gradient amplitude, Gx(x, y) is the transverse gray gradient, Gy(x, y) is the longitudinal gray gradient, and θ(x, y) is the gray gradient direction.
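Continuing the sketch above, the amplitude and direction per pixel can be computed as:

```python
import numpy as np

def gradient_amplitude_direction(gx: np.ndarray, gy: np.ndarray):
    """Gradient amplitude A = sqrt(Gx^2 + Gy^2) and direction theta = arctan(Gy / Gx)."""
    amplitude = np.hypot(gx, gy)
    direction = np.arctan2(gy, gx)  # arctan2 handles Gx == 0 and preserves the quadrant
    return amplitude, direction
```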
The obtaining of the edge information retention degree of the gray level image according to the difference between the HIS gradient information and the gray gradient information specifically includes:
obtaining the edge information retention degree according to an edge information retention degree formula [shown as an image; not reproduced], wherein Q is the edge information retention degree, S is the size of the color printing package image, A_HIS(x, y) is the HIS gradient amplitude of the pixel point at coordinate (x, y), A_gray(x, y) is the gray gradient amplitude of the pixel point at coordinate (x, y), θ_HIS(x, y) is the HIS gradient direction of the pixel point at coordinate (x, y), and θ_gray(x, y) is the gray gradient direction of the pixel point at coordinate (x, y).
Since a large amount of edge information caused by color difference exists in the HIS image, the smaller the difference between the grayscale gradient information and the HIS gradient information is, the more edge information in the current grayscale image is, i.e. the greater the retention degree of edge information is.
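Because the retention formula itself is published only as an image, the following is a hedged sketch of one agreement measure consistent with the stated property (smaller amplitude and direction differences give a larger score); the exponential form and the averaging over the image size are assumptions.

```python
import numpy as np

def edge_information_retention(a_his, a_gray, theta_his, theta_gray) -> float:
    """Illustrative edge-information retention Q in (0, 1]: pixel-wise agreement between
    HIS gradient information and gray gradient information, averaged over the image size S.

    Not the patent's exact formula (published only as an image); it only reproduces the
    monotonicity described in the text.
    """
    amp_diff = np.abs(a_his - a_gray)
    dir_diff = np.abs(theta_his - theta_gray)
    per_pixel = np.exp(-(amp_diff + dir_diff))  # smaller differences -> value closer to 1
    return float(per_pixel.mean())              # mean over all pixels, i.e. division by S
```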
Step S2: extracting a plurality of first feature descriptors in the gray level image according to an SIFT algorithm; obtaining a plurality of second feature descriptors according to the projection of each first feature descriptor in a plurality of principal component directions; and obtaining the feature information retention degree of the corresponding first feature descriptor according to the plurality of second feature descriptors.
In feature matching, the Euclidean distance between the feature descriptors of two key points is used for the decision. The stronger the feature information carried by a key point, the larger its distance from unmatched key points during subsequent matching, so the matching is faster and the result more accurate.
A plurality of first feature descriptors are extracted from the gray level image with the SIFT algorithm. Since the SIFT algorithm is well known in the prior art, its procedure is only briefly summarized here:
(1) Generate the Gaussian difference pyramid and construct the scale space.
(2) Detect key points in the gray level image preliminarily through detection of spatial extreme points.
(3) Remove noise points from the preliminary detection to accurately locate the stable key points.
(4) Compute 4 x 4 block gradient direction histograms around each key point, count the gradient amplitudes in 8 directions for each histogram, and obtain a 128-dimensional feature descriptor of the key point as the first feature descriptor. That is, each key point corresponds to one first feature descriptor, and each first feature descriptor contains 128-dimensional data.
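A minimal sketch of this extraction step with OpenCV's SIFT implementation (the tooling choice is an assumption; the descriptors come out as the 128-dimensional vectors described above):

```python
import cv2

def extract_first_descriptors(gray_u8):
    """Detect SIFT key points on an 8-bit gray image and return their 128-D descriptors."""
    sift = cv2.SIFT_create()
    keypoints, descriptors = sift.detectAndCompute(gray_u8, None)  # descriptors: (N, 128) float32
    return keypoints, descriptors
```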
Obtaining a plurality of second feature descriptors according to the projection of each first feature descriptor in a plurality of principal component directions specifically includes:
acquiring the eigenvalues of the covariance matrix of the first feature descriptors and arranging them from large to small to obtain an eigenvalue sequence; taking the first several eigenvalues in the sequence as reference eigenvalues according to a preset selection quantity;
and acquiring the reference eigenvector corresponding to each reference eigenvalue, and multiplying the first feature descriptor by each reference eigenvector to obtain a plurality of second feature descriptors. Each reference eigenvector corresponds to one principal component direction; a larger second feature descriptor means that the first feature descriptor has a larger divergence in the corresponding principal component direction, i.e. more information is retained.
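A sketch of this projection step using a NumPy eigen-decomposition of the descriptor covariance matrix; the selection quantity n = 30 matches the value stated below, and the variable names are assumptions.

```python
import numpy as np

def second_descriptors(first_descriptors: np.ndarray, n: int = 30):
    """Project 128-D first feature descriptors onto the top-n principal component directions.

    Returns the (num_keypoints, n) projections ('second feature descriptors') and the
    corresponding top-n eigenvalues of the descriptor covariance matrix.
    """
    cov = np.cov(first_descriptors, rowvar=False)   # 128 x 128 covariance matrix of the descriptors
    eigvals, eigvecs = np.linalg.eigh(cov)          # eigh gives ascending eigenvalues for a symmetric matrix
    order = np.argsort(eigvals)[::-1][:n]           # sort large-to-small, keep the first n (reference eigenvalues)
    ref_vals = eigvals[order]
    ref_vecs = eigvecs[:, order]                    # columns are the reference eigenvectors / principal directions
    projections = first_descriptors @ ref_vecs      # multiply each descriptor by the reference eigenvectors
    return projections, ref_vals
```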
The feature information retention degree of the corresponding first feature descriptor is obtained from the magnitudes of the second feature descriptors, specifically:
the feature information retention degree is obtained according to a feature information retention degree formula [shown as an image; not reproduced], wherein F is the feature information retention degree, e is the natural constant, n is the selection quantity, P_k is the k-th second feature descriptor, λ_k is the eigenvalue corresponding to the k-th second feature descriptor, U is the number of eigenvalues, and λ_u is the u-th eigenvalue. In the embodiment of the present invention the selection quantity is set to n = 30, and since the first feature descriptor is 128-dimensional data, U = 128.
In the feature information retention degree formula, the ratio of the eigenvalue corresponding to each second feature descriptor is used as a weight that amplifies the second feature descriptors: the larger the eigenvalue and the larger the second feature descriptor, the greater the feature information retention degree.
The feature information retention degree of the first feature descriptor of every key point in the gray level image is analyzed, and these retention degrees are accumulated to obtain the overall feature information retention of the current gray level image in the feature point analysis.
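As with the edge term, the exact expression exists only as an image; the sketch below is one eigenvalue-weighted score consistent with the description (eigenvalue ratios as weights, larger projections giving a larger score) and is an assumption, not the published formula.

```python
import numpy as np

def feature_information_retention(projections: np.ndarray, ref_vals: np.ndarray,
                                  all_eigvals_sum: float) -> np.ndarray:
    """Illustrative per-keypoint feature-information retention.

    projections: (num_keypoints, n) second feature descriptors
    ref_vals:    (n,) eigenvalues of the n retained principal directions
    all_eigvals_sum: sum of all U = 128 eigenvalues (normalizer for the weights)
    Not the patent's exact formula; eigenvalue-ratio weighting of projection magnitudes only.
    """
    weights = ref_vals / all_eigvals_sum   # eigenvalue ratio of each principal direction
    return np.abs(projections) @ weights   # one retention value per first feature descriptor
```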
Step S3: obtaining the recommendation degree of the gray level image according to the edge information retention degree and the feature information retention degree; changing parameters in the gray level conversion formula to obtain a plurality of recommendation degrees, and taking the gray level conversion formula corresponding to the maximum recommendation degree as the standard gray level conversion formula.
The edge information retention degree and the feature information retention degree are analyzed jointly to obtain the recommendation degree of the current gray level image, specifically:
obtaining the recommendation degree according to a recommendation degree formula [shown as an image; not reproduced], wherein T is the recommendation degree, Q is the edge information retention degree, F_m is the feature information retention degree of the m-th first feature descriptor, and N is the number of first feature descriptors.
In the recommendation degree formula, a weight derived from the edge information retention degree is applied to the overall feature information retention; when the edge information retention degree is less than 0.5 this weight is less than 1 and reduces the overall feature information retention, so that the recommendation degree analysis places more emphasis on the edge information retention degree.
By changing the parameters in the gray level conversion formula, the recommendation degree corresponding to each parameter combination can be obtained. Because all color printing package products in a production batch are the same product, i.e. correspond to the same color printing template, only one optimal parameter combination needs to be obtained to detect all products corresponding to that template. A large number of parameter combinations can therefore be sampled, and the combination with the maximum recommendation degree is selected to give the standard gray level conversion formula, as sketched below. It should be noted that the number of parameter combinations may be set according to the specific production environment and is not limited here.
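A hedged sketch of the parameter search follows. The recommendation formula is published only as an image, so the combination used here (an edge-retention weight 2Q, which is below 1 exactly when Q < 0.5, times the summed feature retention) is an assumption matching the weighting behaviour described above; the retention helpers are supplied by the caller, for example the sketches from the earlier steps.

```python
import itertools
import numpy as np

def select_standard_weights(rgb_image: np.ndarray, edge_retention_fn, feature_retention_fn,
                            step: float = 0.05):
    """Grid-search the gray-conversion weights (a, b, c), a + b + c = 1, and keep the
    combination with the maximum recommendation degree.

    edge_retention_fn(rgb_image, gray) -> Q and feature_retention_fn(gray) -> sum of F
    are caller-supplied. T = 2 * Q * sum(F) is an assumed stand-in for the image-only formula.
    """
    best_weights, best_t = None, -np.inf
    for a, b in itertools.product(np.arange(0.0, 1.0 + 1e-9, step), repeat=2):
        c = 1.0 - a - b
        if c < -1e-9:
            continue
        c = max(c, 0.0)
        gray = np.clip(a * rgb_image[..., 0] + b * rgb_image[..., 1] + c * rgb_image[..., 2],
                       0, 255).astype(np.uint8)           # assumes R, G, B channel order
        t = 2.0 * edge_retention_fn(rgb_image, gray) * feature_retention_fn(gray)
        if t > best_t:
            best_weights, best_t = (a, b, c), t
    return best_weights
```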
Step S4: acquiring a color printing package image to be detected; obtaining a gray level image to be detected of the color printing package image to be detected according to the standard gray level conversion formula; obtaining a template gray level image of the template color printing package image according to the standard gray level conversion formula; acquiring and matching the feature points of the gray level image to be detected and the template gray level image according to the SIFT algorithm, and identifying the color printing package image to be detected according to the matching result.
At this point, the standard gray level conversion formula corresponding to the color printing product has been obtained from just one collected color printing package image. For each color printing package product to be detected, the gray level image to be detected is obtained directly with the standard gray level conversion formula, and the template gray level image of the template color printing package image is obtained in the same way. The feature points of the gray level image to be detected and of the template gray level image are acquired with the SIFT algorithm and matched against each other. If the matching succeeds, the current color printing package product to be detected is identified successfully and the product is qualified; if the matching fails or the matching results differ, the current color printing package product to be detected is defective, is unqualified, and requires further judgment or discarding by a worker.
It should be noted that the specific standard defining a qualified matching result may be set according to the precision required by the production environment and is not limited here; one such standard is a minimum number of good matches, as in the sketch below.
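A sketch of the matching step with OpenCV's brute-force matcher and Lowe's ratio test; the ratio 0.75 and the minimum-match threshold are illustrative assumptions standing in for the configurable qualification standard mentioned above.

```python
import cv2

def matches_template(test_gray, template_gray, ratio: float = 0.75, min_good_matches: int = 10) -> bool:
    """Match SIFT descriptors of the gray image under test against the template gray image."""
    sift = cv2.SIFT_create()
    _, desc_test = sift.detectAndCompute(test_gray, None)
    _, desc_tmpl = sift.detectAndCompute(template_gray, None)
    if desc_test is None or desc_tmpl is None:
        return False
    matcher = cv2.BFMatcher(cv2.NORM_L2)
    knn = matcher.knnMatch(desc_test, desc_tmpl, k=2)
    good = [pair[0] for pair in knn
            if len(pair) == 2 and pair[0].distance < ratio * pair[1].distance]  # Lowe's ratio test
    return len(good) >= min_good_matches
```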
In summary, in the embodiment of the invention the edge information retention degree of the current gray level image is obtained from the difference between the gradient information in the HIS image and that in the gray level image. The first feature descriptors of the key points in the gray level image are analyzed to obtain the feature information retention degree of each first feature descriptor, and the recommendation degree is obtained from the edge information retention degree and the feature information retention degree. The optimal standard gray level conversion formula is then selected for processing the color printing package image to be detected and the template color printing package image, achieving efficient matching and identification. The embodiment of the invention optimizes the color printing package detection algorithm within an artificial intelligence system for the production field, increases the identification and detection efficiency and precision for color printing package products, and realizes identification and detection of mass-produced products under a color printing template by using the standard gray level conversion formula and computer vision software.
It should be noted that the order of the above embodiments of the present invention is given only for description and does not reflect their relative merits. The processes depicted in the accompanying figures do not necessarily require the particular order shown, or a sequential order, to achieve the desired results. In some embodiments, multitasking and parallel processing are also possible and may be advantageous.
All the embodiments in the present specification are described in a progressive manner, and the same and similar parts among the embodiments are referred to each other, and each embodiment focuses on the differences from other embodiments.
The above description is only for the purpose of illustrating the preferred embodiments of the present invention and is not to be construed as limiting the invention, and any modifications, equivalents, improvements and the like that fall within the spirit and principle of the present invention are intended to be included therein.

Claims (3)

1. A method of identifying a color printed package using an electronic device, the method comprising:
obtaining a color printing package image; converting the color printing package image into an HIS color space to obtain an HIS image; obtaining a gray level image of the color printing package image according to a preset gray level conversion formula; obtaining gray gradient information in the gray level image; obtaining HIS gradient information according to the hue difference, the saturation difference and the brightness difference among the pixel points in the HIS image; obtaining the edge information retention degree of the gray level image according to the difference between the HIS gradient information and the gray gradient information;
extracting a plurality of first feature descriptors in the gray image according to an SIFT algorithm; obtaining a plurality of second feature descriptors according to the projection of each first feature descriptor in a plurality of principal component directions; obtaining the feature information retention degree of the corresponding first feature descriptor according to the plurality of second feature descriptors;
obtaining the recommendation degree of the gray level image according to the edge information retention degree and the feature information retention degree; changing parameters in the gray scale conversion formula to obtain a plurality of recommendation degrees, and taking the gray scale conversion formula corresponding to the maximum recommendation degree as a standard gray scale conversion formula;
acquiring a color printing package image to be detected; obtaining a to-be-detected gray level image of the to-be-detected color printing packaging image according to the standard gray level conversion formula; obtaining a template gray level image of the template color printing packaging image according to the standard gray level conversion formula; acquiring and matching feature points of the gray level image to be detected and the template gray level image according to an SIFT algorithm, and identifying the color printing packaging image to be detected according to a matching result;
the obtaining of the edge information retention degree of the gray level image according to the difference between the HIS gradient information and the gray gradient information includes:
obtaining the edge information retention degree according to an edge information retention degree formula [shown as an image; not reproduced], wherein Q is the edge information retention degree, S is the size of the color printing package image, A_HIS(x, y) is the HIS gradient amplitude of the pixel point at coordinate (x, y), A_gray(x, y) is the gray gradient amplitude of the pixel point at coordinate (x, y), θ_HIS(x, y) is the HIS gradient direction of the pixel point at coordinate (x, y), and θ_gray(x, y) is the gray gradient direction of the pixel point at coordinate (x, y);
the obtaining a plurality of second feature descriptors from projections of each of the first feature descriptors in a plurality of principal component directions comprises:
acquiring a plurality of eigenvalues of the covariance matrix of the first feature descriptors, and arranging the eigenvalues from large to small to obtain an eigenvalue sequence; taking the first several eigenvalues in the eigenvalue sequence as reference eigenvalues according to a preset selection quantity;
acquiring the reference eigenvector corresponding to each reference eigenvalue, and multiplying the first feature descriptor by the reference eigenvectors to obtain a plurality of second feature descriptors;
the obtaining the feature information retention degree of the corresponding first feature descriptor according to the plurality of second feature descriptors comprises:
obtaining the feature information retention degree according to a feature information retention degree formula [shown as an image; not reproduced], wherein F is the feature information retention degree, e is the natural constant, n is the selection quantity, P_k is the k-th second feature descriptor, λ_k is the eigenvalue corresponding to the k-th second feature descriptor, U is the number of eigenvalues, and λ_u is the u-th eigenvalue;
the obtaining of the recommendation degree of the gray level image according to the edge information retention degree and the feature information retention degree includes:
obtaining the recommendation degree according to a recommendation degree formula [shown as an image; not reproduced], wherein T is the recommendation degree, Q is the edge information retention degree, F_m is the feature information retention degree of the m-th first feature descriptor, and N is the number of the first feature descriptors.
2. The method of claim 1, wherein obtaining gray gradient information in the gray level image comprises:
calculating the transverse gray gradient and the longitudinal gray gradient of each pixel point in the gray level image according to a Sobel operator, specifically:
Gx(x, y) = [f(x+1, y-1) + 2f(x+1, y) + f(x+1, y+1)] - [f(x-1, y-1) + 2f(x-1, y) + f(x-1, y+1)]
Gy(x, y) = [f(x-1, y+1) + 2f(x, y+1) + f(x+1, y+1)] - [f(x-1, y-1) + 2f(x, y-1) + f(x+1, y-1)]
wherein Gx(x, y) is the transverse gray gradient of the pixel point (x, y), Gy(x, y) is the longitudinal gray gradient of the pixel point (x, y), and f denotes the gray values of the eight neighborhood pixel points of the pixel point (x, y);
and obtaining a gray gradient amplitude and a gray gradient direction according to the transverse gray gradient and the longitudinal gray gradient of each pixel point, and taking the gray gradient amplitude and the gray gradient direction as the gray gradient information.
3. The method of claim 2, wherein the obtaining of the HIS gradient information according to the hue difference, the saturation difference and the brightness difference between the pixel points in the HIS image comprises:
replacing the difference of pixel values in the transverse gray gradient and the longitudinal gray gradient with an HIS difference to obtain a transverse HIS gradient and a longitudinal HIS gradient, wherein the HIS difference D(i, j) between the i-th pixel point and the j-th pixel point [formula shown as an image; not reproduced] is obtained from H_i and H_j, the hue information of the i-th and j-th pixel points, S_i and S_j, their saturation information, and I_i and I_j, their brightness information;
and obtaining an HIS gradient amplitude and an HIS gradient direction according to the transverse HIS gradient and the longitudinal HIS gradient of each pixel point, and taking the HIS gradient amplitude and the HIS gradient direction as the HIS gradient information.
CN202210980250.0A 2022-08-16 2022-08-16 Method for identifying color printing package by using electronic equipment Active CN115063605B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210980250.0A CN115063605B (en) 2022-08-16 2022-08-16 Method for identifying color printing package by using electronic equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210980250.0A CN115063605B (en) 2022-08-16 2022-08-16 Method for identifying color printing package by using electronic equipment

Publications (2)

Publication Number Publication Date
CN115063605A CN115063605A (en) 2022-09-16
CN115063605B true CN115063605B (en) 2022-12-13

Family

ID=83207635

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210980250.0A Active CN115063605B (en) 2022-08-16 2022-08-16 Method for identifying color printing package by using electronic equipment

Country Status (1)

Country Link
CN (1) CN115063605B (en)

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108596197B (en) * 2018-05-15 2020-08-25 汉王科技股份有限公司 Seal matching method and device
US11092966B2 (en) * 2018-12-14 2021-08-17 The Boeing Company Building an artificial-intelligence system for an autonomous vehicle
CN114494265B (en) * 2022-04-19 2022-06-17 南通宝田包装科技有限公司 Method for identifying packaging printing quality in cosmetic production field and artificial intelligence system
CN114648594B (en) * 2022-05-19 2022-08-16 南通恒强家纺有限公司 Textile color detection method and system based on image recognition

Also Published As

Publication number Publication date
CN115063605A (en) 2022-09-16

Similar Documents

Publication Publication Date Title
CN101770582B (en) Image matching system and method
CN103424409B (en) Vision detecting system based on DSP
US9721532B2 (en) Color chart detection apparatus, color chart detection method, and color chart detection computer program
CN103914708B (en) Food kind detection method based on machine vision and system
US8995709B2 (en) Method for calculating weight ratio by quality grade in grain appearance quality grade discrimination device
US8879849B2 (en) System and method for digital image signal compression using intrinsic images
CN114648594B (en) Textile color detection method and system based on image recognition
CN109753945B (en) Target subject identification method and device, storage medium and electronic equipment
CN108564631A (en) Car light light guide acetes chinensis method, apparatus and computer readable storage medium
Patki et al. Cotton leaf disease detection & classification using multi SVM
Banić et al. Using the red chromaticity for illumination estimation
CN116681979A (en) Power equipment target detection method under complex environment
CN109284759A (en) One kind being based on the magic square color identification method of support vector machines (svm)
CN112819017B (en) High-precision color cast image identification method based on histogram
CN115063605B (en) Method for identifying color printing package by using electronic equipment
US8437545B1 (en) System and method for digital image signal compression using intrinsic images
CN210377552U (en) Fruit is multiaspect image acquisition device for classification
CN110197178A (en) A kind of rice type of TuPu method fusion depth network quickly identifies detection device and its detection method
CN114486895A (en) Method for detecting sample substance concentration based on pushing of urine drying test strip to identification
CN113052176A (en) Character recognition model training method, device and system
KR101158329B1 (en) Apparatus and Method for Extracting Fluorescence Pattern for Automatic Paper Money Inspection
Li et al. A new color-to-gray conversion method based on edge detection
CN111325209A (en) License plate recognition method and system
CN116740550B (en) Refrigerator image integrity recognition method and system
CN110852995B (en) Discrimination method of robot sorting system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant