CN114746864A - Method and apparatus for verifying authenticity of a product - Google Patents
- Publication number: CN114746864A (application number CN201980102577.4A)
- Authority: CN (China)
- Prior art keywords: product, micro, features, image, classifier
- Legal status: Pending (status is assumed, not a legal conclusion)
Classifications
- G06V 20/95 — Pattern authentication; Markers therefor; Forgery detection
- G06Q 30/018 — Certifying business or products
- G06V 20/80 — Recognising image objects characterised by unique random patterns
Abstract
The invention relates to a method and device for verifying the authenticity of a product whose product identifier carries randomly distributed microdots. The method comprises: extracting microdot features from an image of the product identifier of the product under verification; extracting image features of at least a portion of the product identifier from the image using a machine learning algorithm; and verifying the authenticity of the product identifier based on the extracted microdot features and image features. The method and device of the embodiments can improve the accuracy of product authenticity verification.
Description
The present invention relates to a method and apparatus for verifying the authenticity of a product.
Counterfeit products cause significant losses to both producers and consumers and therefore need to be countered with secure, reliable anti-counterfeiting techniques. Existing product anti-counterfeiting technologies include digital anti-counterfeiting and texture anti-counterfeiting.
Digital anti-counterfeiting technology uses bar codes or two-dimensional codes to give each product a unique identification (ID), enabling verification and traceability; however, such codes are easy to copy and therefore offer poor security.
Texture anti-counterfeiting technology uses randomly generated natural textures as anti-counterfeiting features; such textures are physically unclonable. However, existing texture-based schemes lack automatic recognition of the anti-counterfeiting features, requiring either visual inspection or the addition of fiber materials during production to form the features, which raises the cost of anti-counterfeit products and complicates production.
More recently, a technology has been developed that combines a bar code or two-dimensional code with printed microdot features, further improving the anti-counterfeiting performance of the product identifier while simplifying production and reducing cost. However, verification first requires acquiring an image of the product identifier of the product to be verified, for example by having the user photograph it with a mobile phone or digital camera. Differences in camera capability, shooting environment (e.g., lighting), and shooting technique (e.g., angle, distance, and camera stability) degrade image quality to varying degrees, which can bias the verification result: the identifier of a genuine product may be judged counterfeit, or that of a counterfeit judged genuine.
Disclosure of Invention
In view of at least one of the above problems of the prior art, embodiments of the present invention provide a method and apparatus for verifying the authenticity of a product, which can improve the accuracy of authenticity verification.
An embodiment of the present invention provides a method for verifying the authenticity of a product whose product identifier carries randomly distributed microdots, the method comprising: extracting microdot features from an image of the product identifier of the verified product; extracting image features of at least a portion of the product identifier from the image using a machine learning algorithm; and verifying the authenticity of the product identifier of the verified product based on the extracted microdot features and image features.
An embodiment of the present invention provides an apparatus for verifying the authenticity of a product whose product identifier carries randomly distributed microdots, the apparatus comprising: a microdot feature extraction module for extracting microdot features from an image of the product identifier of the verified product; an image feature extraction module for extracting image features of at least a portion of the product identifier from the image using a machine learning algorithm; and a verification module for verifying the authenticity of the product identifier of the verified product based on the extracted microdot features and image features.
An embodiment of the present invention provides an apparatus for verifying authenticity of a product having randomly distributed micro-dots on a product identifier thereof, the apparatus comprising: a memory for storing instructions; and a processor coupled to the memory, the instructions when executed by the processor causing the processor to perform a method according to the above embodiments.
Embodiments of the present invention also provide a computer-readable storage medium having stored thereon executable instructions that, when executed by a computer, cause the computer to perform the method of the above-described embodiments.
According to the embodiments of the invention, verifying the authenticity of a product identifier uses not only the microdot features on the identifier but also image features extracted from its image by a machine learning algorithm, which improves the accuracy of the verification.
Other features, characteristics, benefits and advantages of the present invention will become more apparent from the detailed description taken in conjunction with the following drawings, in which:
fig. 1 shows a flow chart of a method for verifying authenticity of a product according to a first embodiment of the invention;
FIG. 2 is a schematic illustration of embedding a micro-dot feature into a product two-dimensional code in an embodiment;
FIGS. 3(a) and 3(b) show the probability density function when a uniform distribution is used as the random distribution function of the microdots, and a map of microdots sampled from that distribution;
fig. 4 shows a flow chart of a method for verifying authenticity of a product according to a second embodiment of the invention;
FIGS. 5A-5C illustrate three example convolutional neural network structures used for extracting image features of the product identifier in an embodiment of the present invention; and
fig. 6 is a block diagram showing the construction of an apparatus for verifying the authenticity of a product according to an embodiment of the present invention.
Embodiments of the present invention are further described below with reference to the accompanying drawings.
Fig. 1 shows a flow chart of a method 100 for verifying authenticity of a product according to a first embodiment of the invention. The method 100 shown in FIG. 1 may be implemented by any computing device having computing capabilities. The computing device may be, but is not limited to, a desktop computer, a laptop computer, a tablet computer, a server, a smart phone, or the like.
As shown in fig. 1, the verification method 100 includes: extracting a micro-point feature on a product identifier from an image of the product identifier of the verified product (step 101); extracting image features of at least a portion of the product identity from the image of the product identity using a machine learning algorithm (step 102); and verifying the authenticity of the product identification of the verified product based on the extracted micro-point features and the image features (step 103).
In an embodiment of the present invention, the verifying step 103 comprises: and verifying the authenticity of the product identification of the verified product by using a classifier trained by a machine learning algorithm.
In an embodiment of the present invention, the machine learning algorithm used to extract the image features is a convolutional neural network (CNN). The machine learning algorithm used to train the classifier is one that can classify feature vectors, such as a support vector machine (SVM) or a boosted tree. A CNN is a deep feedforward neural network built around convolution operations; it has feature-learning capability and classifies its input in a translation-invariant way through its hierarchical structure. Inspired by biological visual perception, a CNN supports both supervised and unsupervised learning, and the parameter sharing of its convolution kernels and the sparsity of inter-layer connections let it learn grid-structured data (such as pixels or audio) with a modest amount of computation.
In an embodiment of the present invention, the verification method may further include: the convolutional neural network and the classifier are trained by employing a plurality of genuine article identification images as positive samples and a plurality of counterfeit article identification images as negative samples.
In an embodiment of the invention, the step 102 of extracting image features comprises: extracting image features from the image using the trained convolutional neural network, which outputs feature vectors describing those features.
In an embodiment of the invention, the classifier comprises a first classifier, the extracted image features comprising at least printed features related to the printing of at least a part of the product identification; the first classifier distinguishes between authenticity of a product identification of the verified product based on the printed features. The first classifier may be trained with positive and negative examples of product identification. The verification step 103 may include: outputting, using the trained first classifier, a probability of being true of a product identification of the verified product based on the extracted printed features; and/or using the trained first classifier to output a probability that the product identification of the verified product is false based on the extracted printed features.
In an embodiment of the present invention, the printed features of a genuine product identifier are features associated with at least one of the paper, ink, and printing equipment used to print it. Printing the product identifier transfers a digital file to physical paper or another carrier; the same digital file prints with different fine details depending on the combination of printer settings, printer type, ink, toner or colorant, paper characteristics, and so on. These details constitute the printed features. For example, printed lines may show subtle differences, such as tiny jagged edges with varying shapes or arrangements, depending on the paper, ink, or printing equipment used. A two-dimensional code in a product identifier contains many black and white modules, and every black-white boundary can differ between printing runs. The color or shade of the print may likewise differ. Such printing differences are distributed across the entire printed area of the two-dimensional code, and the printed features of the product identifier can be extracted from them. A counterfeit identifier produced by copying has printed features different from those of the genuine article. The convolutional neural network and the first classifier can be trained with a sufficient number of positive and negative samples.
During training, the convolutional neural network learns the printing characteristics of the positive samples and the differing characteristics of the negative samples; once trained, it can extract the printed features of a verified product identifier. The trained first classifier compares the printed features contained in the CNN-extracted image features against those of a genuine product and outputs the authenticity probability of the verified product identifier.
In an embodiment of the present invention, the classifier may further include a second classifier, and the second classifier determines whether the product identifier is authentic based on the authenticity probability and the micro-point feature of the verified product identifier output by the first classifier. The second classifier may be trained with positive and negative examples of the product identification. The verifying step 103 may further comprise: comparing the extracted micro-point features with micro-point features of a product identifier pre-saved during or after the production of the product; forming a description vector about the product identification based on the comparison result and the authenticity probability of the product identification output by the first classifier; and using a second classifier to judge the authenticity of the product identification of the verified product based on the description vector.
In an embodiment of the invention, the description vector comprises data relating to at least one of: the matching rate between the extracted micro-point features and the pre-stored micro-point features, the statistical parameters of the pixel distances of the matched micro-points from the pre-stored micro-points in the image coordinate system, the number of micro-points which are not matched with the pre-stored micro-point features in the image of the product identifier of the verified product, and the image quality of the obtained product identifier.
In an embodiment of the present invention, the step 101 of extracting microdot features may comprise: extracting at least one of the shape, position, grayscale, and color features of the microdots from the image using image processing techniques.
In an embodiment of the present invention, the product identification may include at least one of a bar code and a two-dimensional graphic code.
Fig. 2 is a schematic diagram of embedding microdot features into a product two-dimensional code in an embodiment; the microdot features 202 are too small to be shown in detail in the figure. To generate the microdot features, an algorithm first produces a specific high-dimensional random distribution map 201 of microdots, which governs at least one of the position distribution, grayscale distribution, color distribution, and micro-morphology of all microdot features. Products of the same class or batch may share a common distribution, while each individual product carries distinct microdot features for identification; for example, different batches may use different random distribution maps, and different products in the same batch use different microdots. The algorithm then samples the random distribution map to generate a uniquely identifying microdot feature 202 for each product (or product identifier or label), embeds the generated feature into the product's digital two-dimensional identifier 203 (e.g., a quick-response matrix code, i.e., a QR code) according to a predetermined avoidance rule, and prints the code with the embedded microdots on the product surface, the product package, or a product label, forming a digital product identification (ID) with microdots. The avoidance rule may constrain at least one of the position distribution, grayscale distribution, and color distribution of the microdots.
For example, the position-distribution avoidance rule may ensure that black or dark microdots are generated only within the white modules of the two-dimensional code, and the grayscale- or color-distribution avoidance rule may ensure that the grayscale or color of the microdots satisfies certain grayscale and saturation limits so as not to interfere with those white modules. Together, the avoidance rules ensure that the embedded microdot features do not affect the readability of the two-dimensional code, which still satisfies the applicable national and/or international standards after the microdots are added.
In some embodiments, white microdot features 202 may instead be embedded in the black modules of the two-dimensional code 203, with the avoidance rule restricting microdots to the black modules so that the code still meets the applicable national and/or international standards after the microdots are added. White microdots have the highest contrast within the black modules and can be produced by short pauses of the ink jet during printing.
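The sample-then-filter behavior of the avoidance rule can be sketched as follows. This is a minimal illustration under assumed simplifications — a uniform position distribution, a binary module matrix, and black dots restricted to white modules; the patent's actual generation algorithm and its grayscale/color constraints are not specified here:

```python
import numpy as np

def sample_microdots(qr_modules, n_dots, rng):
    """Sample dot coordinates uniformly, keeping only those that land
    in white modules (position avoidance rule for dark dots).

    qr_modules: 2-D array, 1 = white module, 0 = black module.
    Returns an (n_dots, 2) array of (x, y) coordinates in [0, 1).
    """
    size = qr_modules.shape[0]
    dots = []
    while len(dots) < n_dots:
        x, y = rng.random(2)                     # uniform PDF(x, y) = const
        col, row = int(x * size), int(y * size)  # module containing the dot
        if qr_modules[row, col] == 1:            # keep dots in white modules only
            dots.append((x, y))
    return np.array(dots)

rng = np.random.default_rng(0)
qr = rng.integers(0, 2, size=(21, 21))  # toy 21x21 stand-in for a QR module matrix
dots = sample_microdots(qr, 50, rng)
```

Rejection sampling keeps the retained dots uniformly distributed over the white area, so the batch-level distribution characteristic is preserved while the rule is enforced per dot.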
A microdot feature comprises, at minimum, two-dimensional coordinates (x, y) as a position feature, and may also include optional features such as color, grayscale, and shape. Typically, the unclonability and anti-counterfeiting strength of the microdot feature derive first from the random distribution of the dots' two-dimensional positions; the color, grayscale, or shape features of the microdots can further improve the product's anti-counterfeiting performance. Randomly distributed microdot features may also form randomly distributed microdot texture features.
After the product identifier is manufactured or in the manufacturing process, the micro-point characteristic information on the product identifier needs to be stored in a database for subsequent product authenticity verification. The stored information of the characteristics of the micro-points includes, for example, randomly distributed position characteristics, and other characteristics such as color, gray scale, or shape thereof.
As an example of the microdot features, figs. 3(a) and 3(b) show the probability density function when a uniform distribution is used as the random distribution function of the microdots, and a map of microdots sampled from that distribution. The probability density function of the uniform distribution is:
PDF(x,y)=const
In the image of the probability density function in fig. 3(a), the Z coordinate is the probability density, and the horizontal coordinates X and Y indicate the position (x, y) of a microdot. The microdot distribution map of fig. 3(b) is obtained by sampling the microdot coordinates (x, y) from the random distribution of fig. 3(a).
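The uniform case of fig. 3 can be checked numerically. The sketch below (illustrative only) samples coordinates with PDF(x, y) = const and verifies that the empirical density is roughly flat, matching the constant surface of fig. 3(a):

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000
xy = rng.random((n, 2))          # (x, y) ~ Uniform([0, 1)^2), so PDF(x, y) = 1

# Empirical density over a 10x10 grid; for a uniform PDF every cell
# should hold roughly n / 100 of the probability mass, i.e. 0.01.
hist, _, _ = np.histogram2d(xy[:, 0], xy[:, 1], bins=10, range=[[0, 1], [0, 1]])
density = hist / n
```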
Fig. 4 shows a flow chart of a method 400 for verifying the authenticity of a product according to a second embodiment of the invention. In method 400, an image or picture of the product identifier of the product under verification is first obtained (step 401). For example, after purchasing a product, a user may photograph the portion of the product identifier containing the bar code or two-dimensional code and send the resulting image to a verifier or verification device for authenticity verification. In this embodiment, processing of the image containing the two-dimensional code comprises two parts: a convolutional neural network processing part (steps 402-405) and a microdot processing part (steps 406-409). In both parts the image is first preprocessed (steps 402 and 406), applying common image-processing operations such as shading correction, cropping of the effective region, contrast enhancement, image sharpening, and image normalization to the region containing the effective feature (e.g., the two-dimensional code).
In the convolutional neural network processing part, after the preprocessing step 402, a convolutional neural network extracts the image features (step 403): the preprocessed image is fed to the network, which acts as an integral algorithm module and outputs a feature vector — that is, the image containing the two-dimensional code is quantized into a feature vector. The feature vector may comprise k floating-point numbers (e.g., a 1x512 float sequence), also called a k-dimensional feature vector, describing the printed features, i.e., the subtle characteristics of the printed product identifier that are unique to the printing process (physical paper, ink, printing equipment, etc.) and are reflected in the image. In embodiments of the invention, the first classifier may be trained using positive and negative samples of multiple product identifiers and the printed features extracted from them. The pre-trained first classifier analyzes the image features extracted in step 403 for the verified product identifier (step 404), for example comparing them against the printed features of the positive and negative samples, and outputs the probability that the product identifier is genuine or counterfeit (step 405).
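The image-to-feature-vector quantization can be illustrated with a toy feature extractor. This is a minimal numpy sketch (one convolution layer, ReLU, global average pooling), not the patent's undisclosed network architecture; the 512 random kernels merely mirror the 1x512 vector size mentioned above:

```python
import numpy as np

def conv2d_valid(img, kernels):
    """Naive valid-mode 2-D convolution: (H, W) image and (k, kh, kw)
    kernels -> (k, H-kh+1, W-kw+1) feature maps."""
    k, kh, kw = kernels.shape
    H, W = img.shape
    out = np.empty((k, H - kh + 1, W - kw + 1))
    for i in range(out.shape[1]):
        for j in range(out.shape[2]):
            patch = img[i:i + kh, j:j + kw]
            out[:, i, j] = np.tensordot(kernels, patch, axes=([1, 2], [0, 1]))
    return out

def extract_features(img, kernels):
    """Quantize an image into a k-dimensional feature vector:
    convolution -> ReLU -> global average pooling (one float per kernel)."""
    fmaps = np.maximum(conv2d_valid(img, kernels), 0.0)  # ReLU
    return fmaps.mean(axis=(1, 2))

rng = np.random.default_rng(1)
img = rng.random((32, 32))                   # stand-in for a preprocessed label image
kernels = rng.standard_normal((512, 3, 3))   # 512 random (untrained) kernels
features = extract_features(img, kernels)    # 1x512 feature vector
```

A real system would use several trained convolution/pooling layers; the point is only that the final output is a fixed-length vector summarizing the whole image.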
The convolutional neural network may include layers such as Linear1, ReLU, Dropout(), Linear2, and Linear3, with the output of the final Linear3 layer used as the result. Because only authenticity classification is of interest here, a 2-dimensional vector is output, where p1 denotes the probability of genuine and p2 the probability of counterfeit.
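The head described above can be sketched at inference time as follows. The layer widths are hypothetical (the patent gives none), and Dropout is an identity at inference, so it is omitted:

```python
import numpy as np

def softmax(z):
    e = np.exp(z - z.max())   # subtract max for numerical stability
    return e / e.sum()

def classifier_head(x, params):
    """Inference-time forward pass: Linear1 -> ReLU -> Linear2 -> Linear3
    -> softmax over 2 classes. Returns [p1 (genuine), p2 (counterfeit)]."""
    h = np.maximum(x @ params["W1"] + params["b1"], 0.0)  # Linear1 + ReLU
    h = h @ params["W2"] + params["b2"]                   # Linear2
    logits = h @ params["W3"] + params["b3"]              # Linear3 -> 2 logits
    return softmax(logits)

rng = np.random.default_rng(0)
dims = [512, 256, 64, 2]      # hypothetical layer widths, not from the patent
params = {}
for i, (d_in, d_out) in enumerate(zip(dims[:-1], dims[1:]), start=1):
    params[f"W{i}"] = rng.standard_normal((d_in, d_out)) * 0.05
    params[f"b{i}"] = np.zeros(d_out)

p = classifier_head(rng.random(512), params)  # p[0] = p1, p[1] = p2
```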
Training uses the cross-entropy loss function:

H(y, p) = -Σ_i y_i · log(p_i)

where y = [y1, y2] is the target truth value; the genuine label is [1.0, 0.0] and the counterfeit label is [0.0, 1.0].
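A minimal implementation of this loss, with a small epsilon guard against log(0):

```python
import numpy as np

def cross_entropy(y, p, eps=1e-12):
    """H(y, p) = -sum_i y_i * log(p_i); eps guards against log(0)."""
    y, p = np.asarray(y, dtype=float), np.asarray(p, dtype=float)
    return float(-np.sum(y * np.log(p + eps)))

loss_good = cross_entropy([1.0, 0.0], [0.99, 0.01])  # confident and correct -> small loss
loss_bad = cross_entropy([1.0, 0.0], [0.01, 0.99])   # confident but wrong -> large loss
```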
During training of the first classifier, the loss can be computed from the known ground truth of each sample. When the loss (|y - y'|, the absolute difference between the ground truth and the classifier's prediction) is judged to be below a predetermined threshold, training of the convolutional neural network stops; when the loss exceeds the threshold, the neural network parameters and the first classifier's parameters are updated according to the loss value, and the updated network and classifier continue extracting image features and judging their authenticity probability. The first classifier can be considered trained when the loss keeps dropping and settles at a low value (i.e., the threshold).
The positive samples may be the two-dimensional code labels of multiple genuine articles, and the negative samples may be copies of those positive samples obtained in various ways. A negative sample carries the same two-dimensional code label, but its printed features differ in detail from those of the genuine label. Immediately after initialization, before the first classifier is trained, the convolutional neural network cannot extract the printed features of a product identifier. Because the positive and negative samples differ only in their printed features while all other image details are identical, the network can be trained on them to recognize the printed features of each class, such as their type (lines, colors, gray levels, etc.), position, and degree of difference. After sufficient training, the network can quickly and accurately extract the printed features of a verified product identifier.
In the microdot processing part, after the preprocessing step 406, a microdot extraction algorithm extracts the microdot features in the image (step 407): image-processing techniques read the randomly distributed microdot features in the region of the verified product identifier, including statistics on at least one of the dots' position, size, color, and grayscale. For example, the size of each microdot region may be measured by counting the pixels it contains, or grayscale information obtained by averaging the RGB three-channel values of each region. The corresponding genuine microdot features stored in advance for this product identifier are then retrieved from the database and compared with the features just read (step 408), and the comparison result is output (step 409) as one basis for judging the authenticity of the product identifier.
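The comparison of step 408 can be sketched as a simple nearest-neighbour position match. The pixel tolerance and the greedy one-to-one strategy are illustrative assumptions, not the patent's algorithm, and the coordinates are synthetic:

```python
import numpy as np

def match_microdots(read_dots, db_dots, tol=2.0):
    """Greedily match dot coordinates read from the image (in pixels)
    against the dots stored in the database for this label.

    Returns (match_rate, mean_dist, std_dist, n_unmatched)."""
    db = list(map(tuple, db_dots))
    dists = []
    for dot in read_dots:
        if not db:
            break
        d = [np.hypot(dot[0] - q[0], dot[1] - q[1]) for q in db]
        k = int(np.argmin(d))
        if d[k] <= tol:          # matched within tolerance
            dists.append(d[k])
            db.pop(k)            # each stored dot matches at most once
    n_match = len(dists)
    match_rate = n_match / len(db_dots)
    n_unmatched = len(read_dots) - n_match
    mean_d = float(np.mean(dists)) if dists else 0.0
    std_d = float(np.std(dists)) if dists else 0.0
    return match_rate, mean_d, std_d, n_unmatched

rng = np.random.default_rng(3)
db_dots = rng.random((30, 2)) * 100                        # stored positions (pixels)
read_dots = db_dots + rng.normal(0, 0.5, db_dots.shape)    # re-imaged with camera jitter
rate, mean_d, std_d, n_un = match_microdots(read_dots, db_dots)
```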
A description vector X is then formed from the authenticity probability of the product identifier output in step 405 and the micro-point comparison result output in step 409, with the different statistics normalized (step 410). For example, the output authenticity probability of the product identifier becomes one feature dimension, x1. The micro-point comparison result may contribute several statistics obtained by quantizing the matches: the percentage of micro-points found in the target two-dimensional code that match micro-points of the corresponding code in the database is x2; statistical parameters (such as mean and variance) of the pixel distance, in image coordinates, between each matched micro-point and its database counterpart are x3 and x4; and a penalty for unmatched micro-points (mismatches) is x5. Together, this information composes a 5-dimensional description vector for the verified product identifier.
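The matching step itself is not specified in the text; the sketch below assumes a greedy nearest-neighbor match within a fixed pixel radius (`match_radius` is an invented parameter) and assembles the five dimensions x1..x5 exactly as enumerated above.

```python
import math

def describe(prob, read_dots, db_dots, match_radius=3.0):
    """Form the 5-dimensional description vector x1..x5 from the text:
    x1 = authenticity probability from the first classifier,
    x2 = fraction of read dots matched to database dots,
    x3, x4 = mean and variance of the pixel distances of matched dots,
    x5 = count of unmatched dots (the mismatch penalty).
    Dots are (y, x) positions; greedy nearest-neighbor matching within
    `match_radius` pixels stands in for the unspecified matching step."""
    unused = list(db_dots)
    dists = []
    unmatched = 0
    for (y, x) in read_dots:
        best, best_d = None, match_radius
        for d in unused:
            dd = math.hypot(y - d[0], x - d[1])
            if dd <= best_d:
                best, best_d = d, dd
        if best is None:
            unmatched += 1
        else:
            unused.remove(best)          # each database dot matches once
            dists.append(best_d)
    x2 = len(dists) / len(read_dots) if read_dots else 0.0
    x3 = sum(dists) / len(dists) if dists else 0.0
    x4 = sum((d - x3) ** 2 for d in dists) / len(dists) if dists else 0.0
    return [prob, x2, x3, x4, float(unmatched)]
```

In practice the components would also be normalized to comparable scales, as step 410 requires, before being handed to the second classifier.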
In the embodiment of the invention, the product identifier images of all collected positive and negative samples can be processed to obtain the corresponding authenticity probabilities and micro-point statistics, yielding description vectors that form a sample data set. One part (e.g., 80%) of the description vectors can serve as the training set for the second classifier; the remainder (e.g., 20%) serves as the test set. Based on this sample data set, a widely used machine learning algorithm can train and test the second classifier; candidate classifier types include a support vector machine (SVM), a boosted tree, a decision tree, a shallow neural network, a k-nearest-neighbor algorithm, a random forest, and so on. The trained second classifier can then classify the verified product identifier based on the descriptive features in the vector obtained in step 410 (the authenticity probability output by the first classifier together with the micro-point statistics) (step 411), thereby outputting an authenticity judgment for the verified product identifier (step 412).
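Of the classifier families listed, k-nearest-neighbor is the simplest to sketch without any library dependency; the following is an illustrative stand-in for the second classifier and the 80/20 split, not the patented implementation (an SVM or boosted tree would slot in the same way).

```python
import math
import random

def knn_classify(train, query, k=3):
    """Second-classifier sketch using the k-nearest-neighbor option
    named in the text.  `train` is a list of (description_vector,
    label) pairs, label 1 = genuine, 0 = counterfeit; `query` is a
    description vector from step 410.  Majority vote of the k nearest
    training vectors decides the authenticity judgment."""
    def dist(a, b):
        return math.sqrt(sum((ai - bi) ** 2 for ai, bi in zip(a, b)))
    nearest = sorted(train, key=lambda tv: dist(tv[0], query))[:k]
    votes = sum(label for _, label in nearest)
    return 1 if votes * 2 > k else 0

def split_dataset(samples, train_frac=0.8, seed=0):
    """Random 80/20 split into training and test sets, as in the text."""
    rng = random.Random(seed)
    shuffled = samples[:]
    rng.shuffle(shuffled)
    cut = int(len(shuffled) * train_frac)
    return shuffled[:cut], shuffled[cut:]
```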
To prepare the training image samples, a number of different mobile phones on the market can be used to photograph multiple genuine product identifiers under different lighting conditions, and the resulting images serve as positive samples; the same variety of phones, again under different lighting conditions, photograph the manufactured non-genuine labels, and those images serve as negative samples. The positive and negative samples are then randomly split into a training sample set and a test sample set.
The first classifier and the second classifier can be continuously trained by utilizing the accumulated product identification samples, so that more accurate true and false judgment can be given.
Figs. 5A to 5C show three examples of convolutional neural network structures that the method of the embodiment of the present invention can use to extract image features of a product identifier. Fig. 5A shows a VGG network, Fig. 5B a ResNet structure, and Fig. 5C an Inception structure, illustrating three ways of extracting image features with a convolutional neural network. The present invention is not limited to these three network architectures.
In Figs. 5A to 5C, reference numeral 501 denotes the input layer, which receives the preprocessed image of the product identifier. 502 denotes a convolutional layer, which performs feature extraction on the input image data and contains a number of convolution kernels; each element of a kernel corresponds to a weight coefficient and a bias, analogous to a neuron of a feedforward neural network. Each neuron in the convolutional layer is connected to several neurons in a nearby region of the previous layer. The convolutional layer extracts features from each small region of the input image, and convolving with multiple filters yields multiple feature maps. Reference numeral 503 denotes a pooling layer: after the convolutional layer 502 has extracted features, the output feature maps are passed to the pooling layer 503 for feature selection and information filtering. The pooling layer applies a preset pooling function that replaces the value at a single point in a feature map with a statistic of its neighborhood. 504 denotes a fully connected layer, equivalent to the hidden layer of a conventional feedforward neural network. The fully connected layers sit at the end of the hidden part of the convolutional neural network and pass signals only to other fully connected layers. Their role is to combine the extracted features nonlinearly to produce the output; a fully connected layer has no feature extraction capability of its own but instead tries to meet the learning objective using the high-order features already extracted. Reference numeral 505 denotes a residual module, a combination of several convolutional layers with a skip connection, which is the building block of the ResNet structure.
Reference numeral 506 denotes an Inception module, a hidden-layer structure that stacks multiple convolutional and pooling layers. Specifically, an Inception module can apply several different convolution and pooling operations in parallel, using the same padding so that the resulting feature maps have the same size; the channels of these feature maps are then concatenated and passed through an activation function. 510 denotes the output layer, which outputs the image features extracted by the convolutional neural network.
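The roles of the convolutional layer 502 and pooling layer 503 can be illustrated with a dependency-free sketch (pure Python on nested lists; a real network would use tensors and learned kernels):

```python
def conv2d_valid(img, kernel):
    """'Valid' 2-D convolution: slide the kernel over the image and
    take weighted sums, as convolutional layer 502 does for each
    small region.  img and kernel are lists of rows of numbers."""
    kh, kw = len(kernel), len(kernel[0])
    oh, ow = len(img) - kh + 1, len(img[0]) - kw + 1
    return [[sum(img[y + i][x + j] * kernel[i][j]
                 for i in range(kh) for j in range(kw))
             for x in range(ow)]
            for y in range(oh)]

def max_pool(fmap, size=2):
    """2x2 max pooling, the feature-selection step of layer 503:
    each output value replaces a neighborhood by its maximum."""
    return [[max(fmap[y + i][x + j]
                 for i in range(size) for j in range(size))
             for x in range(0, len(fmap[0]) - size + 1, size)]
            for y in range(0, len(fmap) - size + 1, size)]
```

Applying `conv2d_valid` with several kernels produces several feature maps, and an Inception-style module would simply run different kernel sizes in parallel and concatenate the resulting maps channel-wise.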
An apparatus for verifying the authenticity of a product, provided according to an embodiment of the present invention, may include: a memory to store instructions; and a processor coupled to the memory, the processor, when executing the stored instructions, being operable to perform the method according to the embodiments described above. The memory may also store a database containing micro-point features of genuine product identifiers recorded during or after manufacture of the product. A micro-point feature may include at least one of a shape feature, a position feature, a gray-level feature, and a color feature of the micro-points.
A sample library may also be stored in the memory of this embodiment, comprising a plurality of genuine identifier images as positive samples and a plurality of counterfeit identifier images as negative samples. The processor is configured to use at least a portion of the samples in the sample library to train the convolutional neural network, as well as the first classifier and the second classifier used to verify the product identifier.
Fig. 6 is a block diagram illustrating the construction of an apparatus 600 for verifying the authenticity of a product according to an embodiment of the present invention. The apparatus 600 comprises: a micro-point feature extraction module 601, configured to extract a micro-point feature on a product identifier from an image of the product identifier of the verified product; an image feature extraction module 602 for extracting image features of at least a portion of the product identification from the image using a machine learning algorithm; and a verification module 603 for verifying the authenticity of the product identifier of the verified product based on the extracted micro-point features and image features. The apparatus 600 shown in fig. 6 may be implemented by software, hardware or a combination of software and hardware, and may be designed to include corresponding modules to implement the above-described method embodiments for verifying product authenticity of the present invention.
The product verification method and apparatus provided by the embodiments of the present invention combine a product identifier containing a two-dimensional code or bar code with micro-point features and image features, which greatly improves the accuracy of authenticity verification and allows a purchaser of the product to photograph the product identifier with a wide range of mobile phones or cameras, under a wide range of lighting conditions, and still obtain accurate verification.
The embodiments of the invention disclosed above are intended to be illustrative rather than restrictive. Those skilled in the art will appreciate that various modifications, adaptations, and variations may be made to the above disclosed embodiments without departing from the spirit of the invention, and that such modifications, adaptations, and variations are intended to be within the scope of the invention. Accordingly, the scope of the invention should be determined from the following claims.
Claims (16)
- A method for verifying authenticity of a product having a product identifier with randomly distributed microdots thereon, the method comprising: extracting a micro-point feature of the product identifier of a verified product from an image of the product identifier; extracting image features of at least a portion of the product identifier from the image using a machine learning algorithm; and verifying the authenticity of the product identifier of the verified product based on the extracted micro-point feature and image features.
- The method of claim 1, wherein the verifying step comprises: verifying the authenticity of the product identifier of the verified product using a classifier trained by a machine learning algorithm.
- The method of claim 2, wherein the machine learning algorithm used to extract the image features is a convolutional neural network, and the machine learning algorithm used to train the classifier is one capable of classifying feature vectors.
- The method of claim 3, further comprising: training the convolutional neural network and the classifier using a plurality of genuine product identifier images as positive samples and a plurality of counterfeit product identifier images as negative samples.
- The method of claim 4, wherein the step of extracting image features comprises: extracting the image features from the image using a trained convolutional neural network to output feature vectors describing the image features.
- The method of claim 2, wherein the extracted image features include at least printed features related to the printing of at least a portion of the product identifier; the classifier comprises a first classifier that distinguishes the authenticity of the product identifier of the verified product based on the printed features, the first classifier being trained with positive and negative samples of product identifiers; and wherein the verifying step comprises: outputting, using the trained first classifier, a probability that the product identifier of the verified product is genuine based on the printed features; and/or outputting, using the trained first classifier, a probability that the product identifier of the verified product is counterfeit based on the printed features.
- The method according to claim 6, wherein the printed features are features associated with at least one of the paper, ink, or printing equipment used in printing the product identifiers of the positive samples.
- The method according to claim 6, wherein the classifier further comprises a second classifier that judges the authenticity of the product identifier based on the micro-point feature and the authenticity probability output by the first classifier for the product identifier of the verified product, the second classifier being trained with positive and negative samples of product identifiers; and wherein the verifying step further comprises: comparing the extracted micro-point features with micro-point features of the product identifier stored in advance during or after manufacture of the anti-counterfeit product; forming a description vector for the product identifier based on the comparison result and the authenticity probability output by the first classifier; and judging the authenticity of the product identifier of the verified product using the second classifier based on the description vector.
- The method of claim 8, wherein the description vector includes data relating to at least one of: the matching rate between the extracted micro-point features and the pre-stored micro-point features, the statistical parameters of the pixel distances of the matched micro-points from the pre-stored micro-points in the image coordinate system, the number of the micro-points which are not matched with the pre-stored micro-point features in the image of the product identifier of the verified product, and the quality of the image.
- The method of claim 1, wherein the step of extracting the micro-point features comprises: extracting at least one of shape features, position features, gray-level features, and color features of the micro-points from the image using an image processing technique.
- The method of claim 1, wherein the product identification comprises at least one of a bar code and a two-dimensional graphic code.
- An apparatus for verifying authenticity of a product having a product identifier with randomly distributed microdots thereon, the apparatus comprising: a micro-point feature extraction module for extracting micro-point features of the product identifier from an image of the product identifier of a verified product; an image feature extraction module for extracting image features of at least a portion of the product identifier from the image using a machine learning algorithm; and a verification module for verifying the authenticity of the product identifier of the verified product based on the extracted micro-point features and image features.
- An apparatus for verifying authenticity of a product having a product identifier with randomly distributed microdots thereon, the apparatus comprising: a memory for storing instructions; and a processor coupled to the memory, the instructions, when executed by the processor, causing the processor to perform the method of any of claims 1-11.
- The apparatus of claim 13, wherein the memory further stores a database containing micro-point features of a product identifier stored during or after manufacture of the product, the micro-point features comprising at least one of shape features, position features, gray-level features, and color features of the micro-points.
- The apparatus of claim 13, wherein the memory further stores a sample library comprising a plurality of genuine identifier images as positive samples and a plurality of counterfeit identifier images as negative samples; and the processor is configured to use at least a portion of the samples in the sample library to train the convolutional neural network, as well as a first classifier and a second classifier for verifying a product identifier.
- A computer-readable storage medium having stored thereon executable instructions that, when executed by a computer, cause the computer to perform the method of any one of claims 1 to 11.
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/CN2019/121446 WO2021102770A1 (en) | 2019-11-28 | 2019-11-28 | Method and device for verifying authenticity of product |
Publications (1)
Publication Number | Publication Date |
---|---|
CN114746864A true CN114746864A (en) | 2022-07-12 |
Family
ID=76129807
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201980102577.4A Pending CN114746864A (en) | 2019-11-28 | 2019-11-28 | Method and apparatus for verifying authenticity of a product |
Country Status (3)
Country | Link |
---|---|
CN (1) | CN114746864A (en) |
DE (1) | DE112019007487T5 (en) |
WO (1) | WO2021102770A1 (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN117253262A (en) * | 2023-11-15 | 2023-12-19 | 南京信息工程大学 | Fake fingerprint detection method and device based on commonality feature learning |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113435219B (en) * | 2021-06-25 | 2023-04-07 | 上海中商网络股份有限公司 | Anti-counterfeiting detection method and device, electronic equipment and storage medium |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103208067A (en) * | 2013-03-13 | 2013-07-17 | 张小北 | Anti-counterfeiting system and label forming method, embedding method, reading method, identifying method and ownership changing method thereof |
CN103279731A (en) * | 2013-06-06 | 2013-09-04 | 格科微电子(上海)有限公司 | Two-dimension code anti-fake method and anti-fake verification method thereof |
CN107578250A (en) * | 2017-07-17 | 2018-01-12 | 中国农业大学 | A kind of dimension code anti-counterfeit method and system |
CN108154207A (en) * | 2018-01-24 | 2018-06-12 | 福州本征光电科技有限公司 | The anti-pseudo-unique code generation of one kind and anti-counterfeit authentication method |
CN108509965A (en) * | 2017-02-27 | 2018-09-07 | 顾泽苍 | A kind of machine learning method of ultra-deep strong confrontation study |
CN110414586A (en) * | 2019-07-22 | 2019-11-05 | 杭州沃朴物联科技有限公司 | Antifalsification label based on deep learning tests fake method, device, equipment and medium |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN108470201A (en) * | 2018-01-24 | 2018-08-31 | 重庆延伸科技开发有限公司 | A kind of multiple random color dot matrix label anti-counterfeit system |
- 2019
- 2019-11-28 CN CN201980102577.4A patent/CN114746864A/en active Pending
- 2019-11-28 WO PCT/CN2019/121446 patent/WO2021102770A1/en active Application Filing
- 2019-11-28 DE DE112019007487.3T patent/DE112019007487T5/en active Pending
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN117253262A (en) * | 2023-11-15 | 2023-12-19 | 南京信息工程大学 | Fake fingerprint detection method and device based on commonality feature learning |
CN117253262B (en) * | 2023-11-15 | 2024-01-30 | 南京信息工程大学 | Fake fingerprint detection method and device based on commonality feature learning |
Also Published As
Publication number | Publication date |
---|---|
DE112019007487T5 (en) | 2022-03-31 |
WO2021102770A1 (en) | 2021-06-03 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
Ramachandra et al. | Towards making morphing attack detection robust using hybrid scale-space colour texture features | |
US20190236614A1 (en) | Artificial intelligence counterfeit detection | |
LeCun et al. | Learning methods for generic object recognition with invariance to pose and lighting | |
CN110998598A (en) | Detection of manipulated images | |
Huang et al. | Multiple features learning for ship classification in optical imagery | |
WO2021179157A1 (en) | Method and device for verifying product authenticity | |
de Souza et al. | On the learning of deep local features for robust face spoofing detection | |
CN110427972B (en) | Certificate video feature extraction method and device, computer equipment and storage medium | |
CN109740572A (en) | A kind of human face in-vivo detection method based on partial color textural characteristics | |
CN114444566B (en) | Image forgery detection method and device and computer storage medium | |
CN114746864A (en) | Method and apparatus for verifying authenticity of a product | |
Berenguel et al. | Evaluation of texture descriptors for validation of counterfeit documents | |
Ibarra-Vazquez et al. | Brain programming is immune to adversarial attacks: Towards accurate and robust image classification using symbolic learning | |
CN113468954B (en) | Face counterfeiting detection method based on local area features under multiple channels | |
Akram et al. | Weber Law Based Approach forMulti-Class Image Forgery Detection. | |
EP3982289A1 (en) | Method for validation of authenticity of an image present in an object, object with increased security level and method for preparation thereof, computer equipment, computer program and appropriate reading means | |
Sowmya et al. | Significance of processing chrominance information for scene classification: a review | |
Khuspe et al. | Robust image forgery localization and recognition in copy-move using bag of features and SVM | |
Sabeena et al. | Digital image forgery detection using local binary pattern (LBP) and Harlick transform with classification | |
Harris et al. | An Improved Signature Forgery Detection using Modified CNN in Siamese Network | |
Abraham | Digital image forgery detection approaches: A review and analysis | |
CN110415424B (en) | Anti-counterfeiting identification method and device, computer equipment and storage medium | |
Loke | A novel approach to texture recognition combining deep learning orthogonal convolution with regional input features | |
KR20200051903A (en) | Fake fingerprint detection method and system | |
Theresia et al. | Image Forgery Detection of Spliced Image Class in Instant Messaging Applications |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||