CN115063605A - Method for identifying color printing package by using electronic equipment - Google Patents

Method for identifying color printing package by using electronic equipment Download PDF

Info

Publication number
CN115063605A
CN115063605A CN202210980250.0A CN202210980250A CN115063605A CN 115063605 A CN115063605 A CN 115063605A CN 202210980250 A CN202210980250 A CN 202210980250A CN 115063605 A CN115063605 A CN 115063605A
Authority
CN
China
Prior art keywords
information
gray
gradient
image
feature
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202210980250.0A
Other languages
Chinese (zh)
Other versions
CN115063605B (en
Inventor
王卓
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nantong Zhuoyue Digital Technology Co ltd
Original Assignee
Nantong Zhuoyue Digital Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nantong Zhuoyue Digital Technology Co ltd filed Critical Nantong Zhuoyue Digital Technology Co ltd
Priority to CN202210980250.0A priority Critical patent/CN115063605B/en
Publication of CN115063605A publication Critical patent/CN115063605A/en
Application granted granted Critical
Publication of CN115063605B publication Critical patent/CN115063605B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Images

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/40Extraction of image or video features
    • G06V10/44Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/40Extraction of image or video features
    • G06V10/46Descriptors for shape, contour or point-related descriptors, e.g. scale invariant feature transform [SIFT] or bags of words [BoW]; Salient regional features
    • G06V10/462Salient features, e.g. scale invariant feature transforms [SIFT]
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/40Extraction of image or video features
    • G06V10/56Extraction of image or video features relating to colour
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/70Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/74Image or video pattern matching; Proximity measures in feature spaces
    • G06V10/75Organisation of the matching processes, e.g. simultaneous or sequential comparisons of image or video features; Coarse-fine approaches, e.g. multi-scale approaches; using context analysis; Selection of dictionaries
    • G06V10/751Comparing pixel values or logical combinations thereof, or feature values having positional relevance, e.g. template matching

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Multimedia (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Evolutionary Computation (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Software Systems (AREA)
  • Databases & Information Systems (AREA)
  • Computing Systems (AREA)
  • Artificial Intelligence (AREA)
  • Health & Medical Sciences (AREA)
  • Image Analysis (AREA)

Abstract

The invention relates to the technical field of artificial intelligence, in particular to a method for identifying color printing packages by utilizing electronic equipment. The method obtains the retention degree of the edge information of the current gray level image through the difference of the gradient information in the HIS image and the gray level image. And analyzing the first feature descriptors of the key points in the gray-scale image to obtain the feature information retention degree of each first feature descriptor. And obtaining recommendation degree according to the marginal information retention degree and the characteristic information retention degree. And further selecting an optimal standard gray scale conversion formula for processing the color printing packaging image to be detected and the template color printing packaging image, and realizing efficient matching identification. The embodiment of the invention optimizes the detection algorithm of the sampling package by the artificial intelligence optimization operation system under the artificial intelligence system in the production field, increases the identification detection efficiency and the identification detection precision of the color printing package product, and realizes the identification detection of the mass products under the color printing template by using a standard gray scale conversion formula and computer vision software.

Description

Method for identifying color printing package by using electronic equipment
Technical Field
The invention relates to the technical field of artificial intelligence, in particular to a method for identifying color printing packages by utilizing electronic equipment.
Background
With the development of the technology in the production field, people have higher requirements on the quality of the produced products. The packaging of the product is an important component of the product, and the quality of the product is also important for detection in the production field. The color printing package directly influences the impression and purchasing power of a consumer on the product, and after the color printing template is determined, the color printing package product is ensured to be matched with the color printing template in the production process of the package.
In the color printing packaging quality detection, the manual detection has low precision and low efficiency, and is not suitable for the development of the automatic production field. Therefore, in the field of automatic production, it is necessary to extract the features of the color printing package image by computer vision technology and match the features with the color printing template. A commonly used matching algorithm is a feature detection algorithm (SIFT algorithm) for interest points, and the SIFT algorithm performs matching according to feature information of key points by searching the key points in different scale spaces. The color-rich SIFT algorithm in the color printing package has large calculation amount, so that the color printing package image can be converted into a gray image, the calculation load of the SIFT algorithm is simplified, but the gray image can cause the loss of edge information of a color image and influence the subsequent matching result.
Disclosure of Invention
In order to solve the above technical problems, an object of the present invention is to provide a method for identifying a color-printed package by using an electronic device, which adopts the following technical solutions:
the invention provides a method for identifying color printing packages by using electronic equipment, which comprises the following steps:
obtaining a color printing package image; converting the sampling package image into an HIS color space to obtain an HIS image; obtaining a gray level image of the color printing packaging image according to a preset gray level conversion formula; obtaining gray gradient information in the gray image; acquiring HIS gradient information according to the difference of color difference, the difference of saturation and the difference of brightness among the pixel points in the HIS image; obtaining the retention degree of the edge information of the gray level image according to the difference between the HIS gradient information and the gray level gradient information;
extracting a plurality of first feature descriptors in the gray image according to an SIFT algorithm; obtaining a plurality of second feature descriptors according to the projection of each first feature descriptor in a plurality of principal component directions; obtaining the feature information retention degree of the corresponding first feature descriptor according to the plurality of second feature descriptors;
obtaining the recommendation degree of the gray level image according to the retention degree of the edge information and the retention degree of the feature information; changing parameters in the gray scale conversion formula to obtain a plurality of recommendation degrees, and taking the gray scale conversion formula corresponding to the maximum recommendation degree as a standard gray scale conversion formula;
acquiring a color printing package image to be detected; obtaining a to-be-detected gray level image of the to-be-detected color printing packaging image according to the standard gray level conversion formula; obtaining a template gray level image of the template color printing packaging image according to the standard gray level conversion formula; and acquiring and matching the characteristic points of the gray image to be detected and the template gray image according to an SIFT algorithm, and identifying the color printing packaging image to be detected according to a matching result.
Further, the obtaining gray gradient information in the gray image comprises:
calculating the transverse gray gradient and the longitudinal gray gradient of each pixel point in the gray image according to a Soble operator, and specifically comprising the following steps:
Figure 374729DEST_PATH_IMAGE001
Figure 886613DEST_PATH_IMAGE002
wherein,
Figure 570535DEST_PATH_IMAGE003
is a pixel point
Figure 440140DEST_PATH_IMAGE004
Is determined by the lateral gray scale gradient of (a),
Figure 93975DEST_PATH_IMAGE005
is a pixel point
Figure 866890DEST_PATH_IMAGE004
The longitudinal gray scale gradient is set to be,
Figure 846347DEST_PATH_IMAGE006
Figure 937669DEST_PATH_IMAGE007
Figure 395195DEST_PATH_IMAGE008
Figure 22617DEST_PATH_IMAGE009
Figure 907396DEST_PATH_IMAGE010
Figure 486014DEST_PATH_IMAGE011
Figure 481652DEST_PATH_IMAGE012
and
Figure 822634DEST_PATH_IMAGE013
is a pixel point
Figure 127583DEST_PATH_IMAGE004
Eight neighborhood pixel values of;
and obtaining a gray gradient amplitude and a gray gradient direction according to the transverse gray gradient and the longitudinal gray gradient of each pixel point, and taking the gray gradient amplitude and the gray gradient direction as gray gradient information.
Further, the obtaining of the HIS gradient information according to the difference in color difference, the difference in saturation, and the difference in brightness between the pixels in the HIS image includes:
replacing the difference of pixel values in the transverse gray gradient and the longitudinal gray gradient with HIS difference to obtain transverse HIS difference and longitudinal HIS difference; the HIS differences include:
Figure 944229DEST_PATH_IMAGE014
wherein,
Figure 494290DEST_PATH_IMAGE015
is as follows
Figure 814413DEST_PATH_IMAGE016
Pixel point and the second
Figure 290263DEST_PATH_IMAGE017
The HIS differences of the individual pixels are,
Figure 594205DEST_PATH_IMAGE018
is as follows
Figure 947957DEST_PATH_IMAGE016
The hue information of each pixel point is calculated,
Figure 653745DEST_PATH_IMAGE019
is as follows
Figure 566075DEST_PATH_IMAGE017
The hue information of each pixel point is obtained,
Figure 91734DEST_PATH_IMAGE020
is as follows
Figure 249177DEST_PATH_IMAGE016
The information of the saturation of each pixel point,
Figure 543892DEST_PATH_IMAGE021
is as follows
Figure 158282DEST_PATH_IMAGE017
The saturation information for each pixel point is determined,
Figure 249866DEST_PATH_IMAGE022
is as follows
Figure 795290DEST_PATH_IMAGE016
The brightness information of each pixel point is obtained,
Figure 695244DEST_PATH_IMAGE024
is as follows
Figure 542852DEST_PATH_IMAGE017
The brightness information of each pixel point;
and obtaining an HIS gradient amplitude and an HIS gradient direction according to the transverse HIS gradient and the longitudinal HIS gradient of each pixel point, and taking the HIS gradient amplitude and the HIS gradient direction as HIS gradient information.
Further, the obtaining of the retention degree of the edge information of the gray scale image according to the difference between the HIS gradient information and the gray scale gradient information includes:
obtaining the edge information retention degree according to an edge information retention degree formula, wherein the edge information retention degree formula comprises:
Figure 777524DEST_PATH_IMAGE025
wherein,
Figure 11190DEST_PATH_IMAGE026
for the degree of retention of the edge information,
Figure 123240DEST_PATH_IMAGE027
the size of the color-printed package image,
Figure 767848DEST_PATH_IMAGE028
is a coordinate of
Figure 506128DEST_PATH_IMAGE029
The HIS gradient magnitude of the pixel point of (a),
Figure 527174DEST_PATH_IMAGE030
is a coordinate of
Figure 165835DEST_PATH_IMAGE029
The gray scale gradient magnitude of the pixel point of (a),
Figure 246923DEST_PATH_IMAGE031
is as a coordinate of
Figure 472499DEST_PATH_IMAGE029
The direction of the HIS gradient of the pixel points of (a),
Figure 562815DEST_PATH_IMAGE032
is a coordinate of
Figure 55982DEST_PATH_IMAGE029
The gray scale gradient direction of the pixel point of (1).
Further, the obtaining a plurality of second feature descriptors according to the projection of each first feature descriptor in a plurality of principal component directions includes:
acquiring a plurality of eigenvalues of the covariance matrix of the first characteristic descriptor, and arranging the eigenvalues from large to small to obtain an eigenvalue sequence; according to a preset selection quantity, a plurality of front characteristic values in the characteristic value sequence are used as reference characteristic values;
and acquiring a reference feature vector corresponding to a reference feature value, and multiplying the first feature descriptor and the reference feature vector to obtain a plurality of second feature descriptors.
Further, the obtaining the feature information retention of the corresponding first feature descriptor according to the plurality of second feature descriptors includes
Obtaining the feature information retention degree according to a feature information retention degree formula, wherein the feature information retention degree formula comprises:
Figure 573551DEST_PATH_IMAGE033
wherein,
Figure 879899DEST_PATH_IMAGE034
for the degree of retention of the characteristic information,
Figure 757594DEST_PATH_IMAGE035
is a natural constant and is a natural constant,
Figure 856000DEST_PATH_IMAGE036
for the purpose of said selection of the number,
Figure 295202DEST_PATH_IMAGE037
is as follows
Figure 947901DEST_PATH_IMAGE038
A second set of said second feature descriptors,
Figure 878554DEST_PATH_IMAGE039
is as follows
Figure 972412DEST_PATH_IMAGE038
The feature values corresponding to the second feature descriptors, U is the number of the feature values,
Figure 940106DEST_PATH_IMAGE040
is as follows
Figure 96412DEST_PATH_IMAGE041
And (c) the characteristic value.
Further, the obtaining of the recommendation degree of the grayscale image according to the edge information retention degree and the feature information retention degree includes:
obtaining the recommendation degree according to a recommendation degree formula, wherein the recommendation degree formula comprises:
Figure 801063DEST_PATH_IMAGE042
wherein,
Figure 857749DEST_PATH_IMAGE043
in order to be the degree of recommendation,
Figure 232230DEST_PATH_IMAGE026
for the degree of retention of the edge information,
Figure 390679DEST_PATH_IMAGE044
is as follows
Figure 148288DEST_PATH_IMAGE045
A first characteristicThe degree of retention of said characteristic information of the descriptor,
Figure 75793DEST_PATH_IMAGE046
is the number of the first feature descriptors.
The invention has the following beneficial effects:
according to the embodiment of the invention, HIS gradient information and gray scale gradient information of the color printing packaging image are obtained at the same time, and the edge information retention degree of the gray scale image under the current gray scale conversion formula is obtained according to the difference of the two gradient information. And further extracting a first feature descriptor of the gray image by utilizing a SIFT algorithm, and obtaining the feature information retention of the first feature descriptor of the gray image under the current gray conversion formula through the projection information of the first feature descriptor in a plurality of principal component directions. And obtaining the recommendation degree of the gray level image under the current gray level conversion formula according to the edge information retention degree and the characteristic information retention degree. And selecting the parameters of the gray level conversion formula according to the recommendation degree, and finally obtaining a gray level image with clear edge information and clear feature information, so that the gray level image is convenient to match with the template gray level image. The embodiment of the invention optimizes the detection algorithm of the sampling package by the artificial intelligence optimization operation system under the artificial intelligence system in the production field, increases the identification detection efficiency and the identification detection precision of the color printing package product, and realizes the identification detection of the mass products under the color printing template by using a standard gray scale conversion formula and computer vision software.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions and advantages of the prior art, the drawings used in the embodiments or the description of the prior art will be briefly described below, it is obvious that the drawings in the following description are only some embodiments of the present invention, and other drawings can be obtained by those skilled in the art without creative efforts.
Fig. 1 is a flowchart of a method for identifying a color-printed package by an electronic device according to an embodiment of the present invention.
Detailed Description
To further illustrate the technical means and effects of the present invention adopted to achieve the predetermined objects, the following detailed description of the method for identifying color-printed packages by electronic devices according to the present invention with reference to the accompanying drawings and preferred embodiments shows the following detailed descriptions of the specific implementation, structure, features and effects thereof. In the following description, the different references to "one embodiment" or "another embodiment" do not necessarily refer to the same embodiment. Furthermore, the particular features, structures, or characteristics may be combined in any suitable manner in one or more embodiments.
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs.
The following describes a specific scheme of the method for identifying color-printed packages by using electronic equipment in detail with reference to the accompanying drawings.
Referring to fig. 1, a flow chart of a method for identifying a color-printed package by an electronic device according to an embodiment of the present invention is shown, the method including:
step S1: obtaining a color printing package image; converting the sampling package image into an HIS color space to obtain an HIS image; obtaining a gray level image of the color printing package image according to a preset gray level conversion formula; obtaining gray gradient information in a gray image; acquiring HIS gradient information according to color difference, saturation difference and brightness difference among pixel points in the HIS image; and obtaining the retention degree of the edge information of the gray level image according to the difference between the HIS gradient information and the gray level gradient information.
In the embodiment of the invention, the product conveyor belt is arranged in the production scene of the color printing package, the produced color printing package product is placed on the conveyor belt, and the electronic equipment is arranged above the conveyor belt and comprises the detection device and the rack for fixing the detection device. The detection device comprises: an imaging device, such as an industrial camera, for imaging and imaging data output of the color-printed packaged product in the region of the conveyor belt; a light source device for performing brightness compensation on the imaging device; the sensing device is used for sensing the detected color printing package and outputting a sensing signal; and the central processing unit is used for receiving the induction signals, realizing corresponding data processing and is electrically connected with the imaging equipment and the induction device.
When the sensing device senses that the color-printed packaging products on the conveyor belt are conveyed to the lower part of the electronic equipment, the sensing signal is transmitted to the central processing unit, the central processing unit controls the light source equipment and the imaging equipment to acquire image data of the color-printed packaging products, and the imaging equipment transmits the image data to the central processing unit for corresponding data processing.
It should be noted that, in the production field, the color-printed packaging products are usually produced in large quantities, that is, the color-printed packaging products for quality inspection are all in a color-printed style, corresponding to a color-printed template.
And acquiring a color printing package image. The color printed package image is converted into the HIS color space. It should be noted that the color printing package image collected by the electronic device is usually an RGB image, and the conversion of the RGB image into the HIS color space is well known in the art and will not be described herein.
Obtaining a gray level image of the color printing packaging image according to a preset gray level conversion formula, wherein the gray level conversion formula comprises the following steps:
Figure 621175DEST_PATH_IMAGE047
wherein,
Figure 250608DEST_PATH_IMAGE048
is a gray value in a gray-scale image,
Figure 828220DEST_PATH_IMAGE049
for the channel value of the red channel in the color printed package image,
Figure 95384DEST_PATH_IMAGE050
for the channel value of the green channel in the color printed package image,
Figure 467460DEST_PATH_IMAGE051
for packing blue channels in images for colour printingThe value of the channel is used to determine,
Figure 849768DEST_PATH_IMAGE034
Figure 965492DEST_PATH_IMAGE052
and
Figure 352742DEST_PATH_IMAGE053
for the weight parameter of the corresponding channel, in the conventional gradation conversion,
Figure 895719DEST_PATH_IMAGE054
Figure 624378DEST_PATH_IMAGE055
Figure 763367DEST_PATH_IMAGE056
when a color printed packaging image is converted into a gray image, information in the gray image is incomplete and edge information is lost due to disappearance of hue information. The HIS image comprises information of hue, saturation and brightness, and the retention degree of the edge information of the current gray level image can be obtained by comparing the information in the HIS image with the information of the current gray level image.
In order to obtain the edge information retention degree, it is first necessary to acquire edge gradient information of the HIS image and the grayscale image. Preferably, calculating the horizontal gray gradient and the vertical gray gradient of each pixel point in the gray image by using a Soble operator, and specifically comprises:
Figure 519970DEST_PATH_IMAGE001
Figure 483116DEST_PATH_IMAGE057
wherein,
Figure 325170DEST_PATH_IMAGE003
is a pixel point
Figure 267849DEST_PATH_IMAGE004
The lateral gray scale gradient of (a) is,
Figure 252860DEST_PATH_IMAGE005
is a pixel point
Figure 278585DEST_PATH_IMAGE004
The longitudinal gradient of the gray scale is provided,
Figure 420985DEST_PATH_IMAGE006
Figure 682202DEST_PATH_IMAGE007
Figure 545157DEST_PATH_IMAGE008
Figure 351570DEST_PATH_IMAGE009
Figure 168217DEST_PATH_IMAGE010
Figure 108491DEST_PATH_IMAGE011
Figure 943460DEST_PATH_IMAGE012
and
Figure 435622DEST_PATH_IMAGE013
is a pixel point
Figure 490296DEST_PATH_IMAGE004
Eight neighborhood pixel values.
In the gray scale gradient information calculation process, the difference of pixel values of other pixel points in the neighborhood of a target pixel point is reflected in the transverse gray scale gradient and the longitudinal gray scale gradient, and when HIS gradient information is analyzed, the hue difference, the saturation difference and the brightness difference need to be considered at the same time, and the method specifically comprises the following steps:
replacing the pixel value difference in the transverse gray gradient and the longitudinal gray gradient with an HIS difference to obtain the transverse HIS difference and the longitudinal HIS difference, wherein the HIS difference comprises:
Figure 732797DEST_PATH_IMAGE058
wherein,
Figure 923738DEST_PATH_IMAGE015
is as follows
Figure 321221DEST_PATH_IMAGE016
Pixel point and the second
Figure 361727DEST_PATH_IMAGE017
The HIS difference of the individual pixel points,
Figure 502858DEST_PATH_IMAGE018
is as follows
Figure 79464DEST_PATH_IMAGE016
The hue information of each pixel point is obtained,
Figure 647849DEST_PATH_IMAGE019
is as follows
Figure 644493DEST_PATH_IMAGE017
The hue information of each pixel point is obtained,
Figure 589315DEST_PATH_IMAGE020
is as follows
Figure 613903DEST_PATH_IMAGE016
The information of the saturation of each pixel point,
Figure 602456DEST_PATH_IMAGE021
is as follows
Figure 102708DEST_PATH_IMAGE017
A pixelThe information of the degree of saturation of a point,
Figure 336374DEST_PATH_IMAGE022
is a first
Figure 871261DEST_PATH_IMAGE016
The luminance information of each pixel point is calculated,
Figure 296295DEST_PATH_IMAGE059
the luminance information of the first pixel point.
The HIS difference is adjusted by a plurality of constants in consideration of the difference in the range of hue, saturation, and brightness. Since hue information and saturation information need to be analyzed together in the HIS space, it is necessary to perform a joint analysis
Figure 362471DEST_PATH_IMAGE060
Can be expressed in terms of a difference in chromaticity,
Figure 117937DEST_PATH_IMAGE061
expressed as brightness variability. To be provided with
Figure 225439DEST_PATH_IMAGE062
And as the weight of the brightness difference, when the chromaticity difference is large, the data of the brightness difference is reduced, and the joint analysis of the three data is realized.
And obtaining a gray gradient amplitude and a gray gradient direction according to the transverse gray gradient and the longitudinal gray gradient of each pixel point, and taking the gray gradient amplitude and the gray gradient direction as gray gradient information. And obtaining an HIS gradient amplitude and an HIS gradient direction according to the transverse HIS gradient and the longitudinal HIS gradient of each pixel point, and taking the HIS gradient amplitude and the HIS gradient direction as HIS gradient information. The gray scale gradient information is the same as the acquisition method of the HIS gradient information, and only the gray scale gradient information is taken as an example:
Figure 650735DEST_PATH_IMAGE063
Figure 499481DEST_PATH_IMAGE064
wherein,
Figure 74950DEST_PATH_IMAGE065
in order to be the gray scale gradient magnitude,
Figure 161592DEST_PATH_IMAGE003
in order to be a lateral gray scale gradient,
Figure 148003DEST_PATH_IMAGE066
in order to be a longitudinal gray scale gradient,
Figure 595295DEST_PATH_IMAGE067
is the gray gradient direction.
The obtaining of the retention degree of the edge information of the gray level image according to the difference between the HIS gradient information and the gray level gradient information specifically includes:
obtaining the retention degree of the edge information according to an edge information retention degree formula, wherein the edge information retention degree formula comprises the following steps:
Figure 223723DEST_PATH_IMAGE025
wherein,
Figure 696030DEST_PATH_IMAGE026
for the degree of retention of the edge information,
Figure 869654DEST_PATH_IMAGE027
in order to color print the size of the package image,
Figure 53510DEST_PATH_IMAGE028
is a coordinate of
Figure 734896DEST_PATH_IMAGE029
The HIS gradient magnitude of the pixel point of (a),
Figure 422230DEST_PATH_IMAGE030
is a coordinate of
Figure 32334DEST_PATH_IMAGE029
The gray scale gradient amplitude of the pixel point of (1),
Figure 15071DEST_PATH_IMAGE068
is a coordinate of
Figure 250880DEST_PATH_IMAGE029
The direction of the HIS gradient of the pixel point of (a),
Figure 74611DEST_PATH_IMAGE032
is a coordinate of
Figure 839305DEST_PATH_IMAGE029
The gray gradient direction of the pixel point.
Because a great amount of edge information caused by color difference exists in the HIS image, the smaller the difference between the gray scale gradient information and the HIS gradient information is, the more the edge information in the current gray scale image is, i.e. the greater the retention degree of the edge information is.
Step S2: extracting a plurality of first feature descriptors in the gray level image according to an SIFT algorithm; obtaining a plurality of second feature descriptors according to projection of each first feature descriptor in a plurality of principal component directions; and obtaining the feature information retention degree of the corresponding first feature descriptor according to the plurality of second feature descriptors.
In the feature matching process, the judgment is carried out according to the Euclidean distance of the features of the key points of the two images, so that the stronger the feature information in one key point is, the larger the difference between the key point and other unmatched key points in the subsequent matching process is, the faster the matching speed is, and the more accurate the result is.
And extracting a plurality of first feature descriptors in the gray-scale image according to a SIFT algorithm. It should be noted that the SIFT algorithm is well known in the prior art, and the process thereof is only briefly described herein, and is not described in detail, and specifically includes:
(1) and generating a Gaussian difference pyramid and finishing the construction of the scale space.
(2) And the preliminary detection of key points in the gray image is realized through the detection of the spatial extreme points.
(3) And removing noise points in the primary probing process, and realizing accurate positioning of stable key points.
(4) And calculating 4-4 block gradient direction histograms around the key point, counting gradient amplitudes of 8 directions in each histogram, and acquiring a 128-dimensional feature descriptor of the key point as a first feature descriptor. That is, a key point corresponds to a first feature descriptor, and a first feature descriptor has 128-dimensional data.
Obtaining a plurality of second feature descriptors according to the projection of each first feature descriptor in a plurality of principal component directions, specifically comprising:
and acquiring a plurality of eigenvalues of the covariance matrix of the first characteristic descriptor, and arranging the eigenvalues from large to small to obtain an eigenvalue sequence. And a plurality of characteristic values in front of the characteristic value sequence are used as reference characteristic values according to a preset selection quantity.
And acquiring a reference feature vector corresponding to the reference feature value, and multiplying the first feature descriptor by the reference feature vector to obtain a plurality of second feature descriptors. Each reference feature vector corresponds to a principal component direction, and the larger the second feature descriptor is, the larger the divergence of the first feature descriptor in the corresponding principal component direction is, which means more information is retained.
Obtaining the feature information retention degree corresponding to the first feature descriptor according to the size of the second feature descriptor, specifically including:
obtaining the retention degree of the feature information according to a feature information retention degree formula, wherein the feature information retention degree formula comprises:
Figure 247021DEST_PATH_IMAGE069
wherein,
Figure 20942DEST_PATH_IMAGE034
in order to preserve the degree of the characteristic information,
Figure 964758DEST_PATH_IMAGE035
is a natural constant and is a natural constant,
Figure 431512DEST_PATH_IMAGE036
in order to select the number of the components,
Figure 326524DEST_PATH_IMAGE037
is as follows
Figure 638557DEST_PATH_IMAGE038
A second set of characteristics describing the characteristics of the device,
Figure 702459DEST_PATH_IMAGE039
is as follows
Figure 120540DEST_PATH_IMAGE038
The feature value corresponding to each second feature descriptor,
Figure 66630DEST_PATH_IMAGE070
in order to be able to determine the number of characteristic values,
Figure 916775DEST_PATH_IMAGE040
is a first
Figure 357156DEST_PATH_IMAGE041
And (4) the characteristic value. In the embodiment of the present invention, the number of choices is set to 30, i.e., the number of choices is set to 30
Figure 900133DEST_PATH_IMAGE071
Since the first feature descriptor is 128-dimensional data, it is therefore
Figure 5623DEST_PATH_IMAGE072
In the feature information retention formula, the feature value ratio corresponding to each second feature descriptor is used as a weight, and the features of the second feature descriptors are amplified, namely the greater the feature value is, the greater the second feature descriptor is, the greater the feature information retention is.
And analyzing the retention degree of the feature information of the first feature descriptor of each key point in the gray level image, accumulating the retention degrees of the feature information, and obtaining the overall retention degree of the feature information of the current gray level image in the feature point analysis.
Step S3: obtaining the recommendation degree of the gray level image according to the retention degree of the edge information and the retention degree of the feature information; and changing parameters in the gray scale conversion formula to obtain a plurality of recommendation degrees, and taking the gray scale conversion formula corresponding to the maximum recommendation degree as a standard gray scale conversion formula.
Jointly analyzing the retention degree of the edge information and the retention degree of the feature information to obtain the recommendation degree of the current gray level image, specifically comprising the following steps:
obtaining the recommendation degree according to a recommendation degree formula, wherein the recommendation degree formula comprises:
Figure 925037DEST_PATH_IMAGE073
wherein, T is the recommendation degree,
Figure 930908DEST_PATH_IMAGE026
for the degree of retention of the edge information,
Figure 910366DEST_PATH_IMAGE044
is as follows
Figure 768731DEST_PATH_IMAGE045
The degree of retention of feature information of the first feature descriptor,
Figure 226258DEST_PATH_IMAGE046
is the number of first feature descriptors.
In the recommendation degree formula, to
Figure 86635DEST_PATH_IMAGE074
As the weight of the retention degree of the global feature information, when the retention degree of the edge information is less than 0.5, the weight is less than 1, and the retention degree of the global feature information is affected, so that the retention degree of the edge information is more focused in the recommendation degree analysis.
By changing the parameters in the gray level conversion formula, the recommendation degree corresponding to each parameter combination can be obtained. Because the color printing packaging products in the whole production batch are the same product, namely correspond to the same color printing template, the detection of the products corresponding to the color printing template can be realized only by obtaining a group of optimal parameter combinations. Therefore, a large number of parameter combinations can be selected, and the combination corresponding to the maximum recommendation degree is selected from the parameter combinations to obtain the standard gray level conversion formula. It should be noted that the number of parameter combinations may be specifically set according to a specific production environment, and is not limited herein.
Step S4: acquiring a color printing package image to be detected; obtaining a gray level image to be detected of the color printing packaging image to be detected according to a standard gray level conversion formula; obtaining a template gray level image of the template color printing packaging image according to a standard gray level conversion formula; and acquiring and matching the characteristic points of the gray image to be detected and the template gray image according to an SIFT algorithm, and identifying the color printing packaging image to be detected according to a matching result.
So far, the standard gray scale conversion formula corresponding to the color printing product can be obtained by only collecting one color printing package image. For the color printing packaging product to be detected, the gray level image to be detected of the color printing packaging image to be detected can be obtained by directly utilizing a standard gray level conversion formula. And obtaining a template gray level image of the template color printing packaging image by using a standard gray level conversion formula. And acquiring the characteristic points of the gray image to be detected and the template gray image according to an SIFT algorithm and performing template matching. If the matching result is successful, the current color printing packaging product to be detected is successfully identified, and the product is qualified; if the matching fails or the matching results are different, the defect of the current color printing packaging product to be detected is indicated, and the product is unqualified and needs further judgment or discarding by a worker.
It should be noted that the specific qualified matching result definition standard may be set according to the required precision of the production environment, and is not limited herein.
In summary, in the embodiments of the present invention, the edge information retention of the current grayscale image is obtained through the difference between the gradient information in the HIS image and the gradient information in the grayscale image. And analyzing the first feature descriptors of the key points in the gray-scale image to obtain the feature information retention degree of each first feature descriptor. And obtaining recommendation degree according to the retention degree of the edge information and the retention degree of the feature information. And further selecting an optimal standard gray scale conversion formula for processing the color printing packaging image to be detected and the template color printing packaging image, and realizing efficient matching identification. The embodiment of the invention optimizes the detection algorithm of the sampling package by the artificial intelligence optimization operation system under the artificial intelligence system in the production field, increases the identification detection efficiency and the identification detection precision of the color printing package product, and realizes the identification detection of the mass products under the color printing template by using a standard gray scale conversion formula and computer vision software.
It should be noted that: the precedence order of the above embodiments of the present invention is only for description, and does not represent the merits of the embodiments. The processes depicted in the accompanying figures do not necessarily require the particular order shown, or sequential order, to achieve desirable results. In some embodiments, multitasking and parallel processing may also be possible or may be advantageous.
The embodiments in the present specification are described in a progressive manner, and the same and similar parts among the embodiments are referred to each other, and each embodiment focuses on the differences from the other embodiments.
The above description is only for the purpose of illustrating the preferred embodiments of the present invention and is not to be construed as limiting the invention, and any modifications, equivalents, improvements and the like that fall within the spirit and principle of the present invention are intended to be included therein.

Claims (7)

1. A method of identifying a color printed package using an electronic device, the method comprising:
obtaining a color printing package image; converting the sampling package image into an HIS color space to obtain an HIS image; obtaining a gray level image of the color printing packaging image according to a preset gray level conversion formula; obtaining gray gradient information in the gray image; acquiring HIS gradient information according to the difference of color difference, the difference of saturation and the difference of brightness among the pixel points in the HIS image; obtaining the retention degree of the edge information of the gray level image according to the difference between the HIS gradient information and the gray level gradient information;
extracting a plurality of first feature descriptors in the gray image according to an SIFT algorithm; obtaining a plurality of second feature descriptors according to the projection of each first feature descriptor in a plurality of principal component directions; obtaining the feature information retention degree of the corresponding first feature descriptor according to the plurality of second feature descriptors;
obtaining the recommendation degree of the gray level image according to the retention degree of the edge information and the retention degree of the feature information; changing parameters in the gray scale conversion formula to obtain a plurality of recommendation degrees, and taking the gray scale conversion formula corresponding to the maximum recommendation degree as a standard gray scale conversion formula;
acquiring a color printing package image to be detected; obtaining a to-be-detected gray level image of the to-be-detected color printing packaging image according to the standard gray level conversion formula; obtaining a template gray level image of the template color printing packaging image according to the standard gray level conversion formula; and acquiring and matching the characteristic points of the gray image to be detected and the template gray image according to an SIFT algorithm, and identifying the color printing packaging image to be detected according to a matching result.
2. The method of claim 1, wherein obtaining gray scale gradient information in the gray scale image comprises:
calculating the transverse gray gradient and the longitudinal gray gradient of each pixel point in the gray image according to a Soble operator, and specifically comprising the following steps:
Figure 163653DEST_PATH_IMAGE001
Figure 848713DEST_PATH_IMAGE002
wherein,
Figure 809846DEST_PATH_IMAGE003
is a pixel point
Figure 395286DEST_PATH_IMAGE004
Is determined by the lateral gray scale gradient of (a),
Figure 9938DEST_PATH_IMAGE005
is a pixel point
Figure 385556DEST_PATH_IMAGE004
The longitudinal gray scale gradient is set to be,
Figure 180074DEST_PATH_IMAGE006
Figure 324748DEST_PATH_IMAGE007
Figure 110301DEST_PATH_IMAGE008
Figure 769953DEST_PATH_IMAGE009
Figure 860441DEST_PATH_IMAGE010
Figure 859621DEST_PATH_IMAGE011
Figure 612814DEST_PATH_IMAGE012
and
Figure 195979DEST_PATH_IMAGE013
is a pixel point
Figure 833765DEST_PATH_IMAGE004
Eight neighborhood pixel values of;
and obtaining a gray gradient amplitude and a gray gradient direction according to the transverse gray gradient and the longitudinal gray gradient of each pixel point, and taking the gray gradient amplitude and the gray gradient direction as gray gradient information.
3. The method of claim 2, wherein the obtaining of the HIS gradient information according to the difference in color, the difference in saturation, and the difference in brightness between the pixels in the HIS image comprises:
replacing the difference of pixel values in the transverse gray gradient and the longitudinal gray gradient with HIS difference to obtain transverse HIS difference and longitudinal HIS difference; the HIS differences include:
Figure 953031DEST_PATH_IMAGE014
wherein,
Figure 578922DEST_PATH_IMAGE015
is as follows
Figure 947587DEST_PATH_IMAGE016
Pixel point and the second
Figure 185801DEST_PATH_IMAGE017
The HIS differences of the individual pixels are,
Figure 595792DEST_PATH_IMAGE018
is as follows
Figure 894049DEST_PATH_IMAGE016
The hue information of each pixel point is obtained,
Figure 953272DEST_PATH_IMAGE019
is as follows
Figure 57494DEST_PATH_IMAGE017
The hue information of each pixel point is obtained,
Figure 56412DEST_PATH_IMAGE020
is as follows
Figure 791149DEST_PATH_IMAGE016
The information of the saturation of each pixel point,
Figure 72089DEST_PATH_IMAGE021
is as follows
Figure 416221DEST_PATH_IMAGE017
The saturation information for each pixel point is determined,
Figure 161323DEST_PATH_IMAGE022
is as follows
Figure 801383DEST_PATH_IMAGE016
The brightness information of each pixel point is obtained,
Figure 835198DEST_PATH_IMAGE023
is as follows
Figure 655124DEST_PATH_IMAGE017
The brightness information of each pixel point;
and obtaining an HIS gradient amplitude and an HIS gradient direction according to the transverse HIS gradient and the longitudinal HIS gradient of each pixel point, and taking the HIS gradient amplitude and the HIS gradient direction as HIS gradient information.
4. The method for identifying color-printed packages by using electronic equipment according to claim 3, wherein the obtaining the retention degree of the edge information of the gray-scale image according to the difference between the HIS gradient information and the gray-scale gradient information comprises:
obtaining the edge information retention degree according to an edge information retention degree formula, wherein the edge information retention degree formula comprises:
Figure 192416DEST_PATH_IMAGE024
wherein,
Figure 65694DEST_PATH_IMAGE025
for the degree of retention of the edge information,
Figure 819761DEST_PATH_IMAGE026
for the size of the color-printed package image,
Figure 944843DEST_PATH_IMAGE027
is a coordinate of
Figure 602220DEST_PATH_IMAGE028
The HIS gradient magnitude of the pixel point of (a),
Figure 82618DEST_PATH_IMAGE029
is a coordinate of
Figure 91025DEST_PATH_IMAGE028
The gray scale gradient magnitude of the pixel point of (a),
Figure 878853DEST_PATH_IMAGE030
is a coordinate of
Figure 390737DEST_PATH_IMAGE028
The direction of the HIS gradient of the pixel points of (a),
Figure 979719DEST_PATH_IMAGE031
is a coordinate of
Figure 209843DEST_PATH_IMAGE028
The gray scale gradient direction of the pixel point of (1).
5. The method of claim 1, wherein obtaining a plurality of second feature descriptors from projections of each of the first feature descriptors in a plurality of principal component directions comprises:
acquiring a plurality of eigenvalues of the covariance matrix of the first characteristic descriptor, and arranging the eigenvalues from large to small to obtain an eigenvalue sequence; according to a preset selection quantity, a plurality of front characteristic values in the characteristic value sequence are used as reference characteristic values;
and acquiring a reference feature vector corresponding to a reference feature value, and multiplying the first feature descriptor and the reference feature vector to obtain a plurality of second feature descriptors.
6. The method of claim 5, wherein said obtaining the retention of the feature information of the corresponding first feature descriptor according to the plurality of second feature descriptors comprises obtaining the retention of the feature information of the corresponding first feature descriptor
Obtaining the feature information retention degree according to a feature information retention degree formula, wherein the feature information retention degree formula comprises:
Figure 739044DEST_PATH_IMAGE032
wherein,
Figure 666287DEST_PATH_IMAGE033
for the degree of retention of said characteristic information,
Figure 255531DEST_PATH_IMAGE034
is a natural constant and is a natural constant,
Figure 645055DEST_PATH_IMAGE035
for the purpose of said selected number of bits,
Figure 476483DEST_PATH_IMAGE036
is as follows
Figure 962959DEST_PATH_IMAGE037
A second set of said second feature descriptors,
Figure 519842DEST_PATH_IMAGE038
is as follows
Figure 458979DEST_PATH_IMAGE037
The feature values corresponding to the second feature descriptors,
Figure 766201DEST_PATH_IMAGE039
is the number of the characteristic values that are,
Figure 372763DEST_PATH_IMAGE040
is as follows
Figure 834969DEST_PATH_IMAGE041
And (c) the characteristic value.
7. The method of claim 1, wherein obtaining the recommendation of the grayscale image according to the retention of the edge information and the retention of the feature information comprises:
obtaining the recommendation degree according to a recommendation degree formula, wherein the recommendation degree formula comprises:
Figure 25516DEST_PATH_IMAGE042
wherein,
Figure 106736DEST_PATH_IMAGE044
in order to be the degree of recommendation,
Figure 567804DEST_PATH_IMAGE046
for the degree of retention of the edge information,
Figure 168288DEST_PATH_IMAGE048
is a first
Figure 347596DEST_PATH_IMAGE050
The feature information retention of each of the first feature descriptors,
Figure 357141DEST_PATH_IMAGE052
is the number of the first feature descriptors.
CN202210980250.0A 2022-08-16 2022-08-16 Method for identifying color printing package by using electronic equipment Active CN115063605B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210980250.0A CN115063605B (en) 2022-08-16 2022-08-16 Method for identifying color printing package by using electronic equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210980250.0A CN115063605B (en) 2022-08-16 2022-08-16 Method for identifying color printing package by using electronic equipment

Publications (2)

Publication Number Publication Date
CN115063605A true CN115063605A (en) 2022-09-16
CN115063605B CN115063605B (en) 2022-12-13

Family

ID=83207635

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210980250.0A Active CN115063605B (en) 2022-08-16 2022-08-16 Method for identifying color printing package by using electronic equipment

Country Status (1)

Country Link
CN (1) CN115063605B (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108596197A (en) * 2018-05-15 2018-09-28 汉王科技股份有限公司 A kind of seal matching process and device
US20200192389A1 (en) * 2018-12-14 2020-06-18 The Boeing Company Building an artificial-intelligence system for an autonomous vehicle
CN114494265A (en) * 2022-04-19 2022-05-13 南通宝田包装科技有限公司 Method for identifying packaging printing quality in cosmetic production field and artificial intelligence system
CN114648594A (en) * 2022-05-19 2022-06-21 南通恒强家纺有限公司 Textile color detection method and system based on image recognition

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108596197A (en) * 2018-05-15 2018-09-28 汉王科技股份有限公司 A kind of seal matching process and device
US20200192389A1 (en) * 2018-12-14 2020-06-18 The Boeing Company Building an artificial-intelligence system for an autonomous vehicle
CN114494265A (en) * 2022-04-19 2022-05-13 南通宝田包装科技有限公司 Method for identifying packaging printing quality in cosmetic production field and artificial intelligence system
CN114648594A (en) * 2022-05-19 2022-06-21 南通恒强家纺有限公司 Textile color detection method and system based on image recognition

Also Published As

Publication number Publication date
CN115063605B (en) 2022-12-13

Similar Documents

Publication Publication Date Title
US9721532B2 (en) Color chart detection apparatus, color chart detection method, and color chart detection computer program
CN103424409B (en) Vision detecting system based on DSP
CN112534243B (en) Inspection apparatus and method, and computer-readable non-transitory recording medium
CN103914708B (en) Food kind detection method based on machine vision and system
CN114648594B (en) Textile color detection method and system based on image recognition
US8879849B2 (en) System and method for digital image signal compression using intrinsic images
EP2638508B1 (en) System and method for identifying complex tokens in an image
CN108279238A (en) A kind of fruit maturity judgment method and device
CN112446864A (en) Flaw detection method, flaw detection device, flaw detection equipment and storage medium
Banić et al. Using the red chromaticity for illumination estimation
CN109284759A (en) One kind being based on the magic square color identification method of support vector machines (svm)
CN114926661B (en) Textile surface color data processing and identifying method and system
CN106951863A (en) A kind of substation equipment infrared image change detecting method based on random forest
CN115984210A (en) Vehicle real-time detection method and system for remote sensing multispectral image of unmanned aerial vehicle
CN112819017B (en) High-precision color cast image identification method based on histogram
CN115063605B (en) Method for identifying color printing package by using electronic equipment
CN113132693B (en) Color correction method
US8437545B1 (en) System and method for digital image signal compression using intrinsic images
CN210377552U (en) Fruit is multiaspect image acquisition device for classification
CN113340816B (en) Light source spectrum and multispectral reflectivity image acquisition method and device and electronic equipment
CN114486895A (en) Method for detecting sample substance concentration based on pushing of urine drying test strip to identification
KR20030091471A (en) YCrCb color based human face location detection method
CN111179256A (en) Method and device for determining authenticity of ceramic tile
CN111325209A (en) License plate recognition method and system
CN112926689B (en) Target positioning method and device, electronic equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant